| id (string, 2-115) | lastModified (string, 24) | tags (list) | author (string, 2-42, nullable) | description (string, 0-68.7k, nullable) | citation (string, 0-10.7k, nullable) | cardData (null) | likes (int64, 0-3.55k) | downloads (int64, 0-10.1M) | card (string, 0-1.01M) |
|---|---|---|---|---|---|---|---|---|---|
open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B | 2023-10-08T14:45:57.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/Chat-AYB-Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/Chat-AYB-Nova-13B](https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T14:44:32.660445](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B/blob/main/results_2023-10-08T14-44-32.660445.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.586765594212576,\n\
\ \"acc_stderr\": 0.03410914044006396,\n \"acc_norm\": 0.5909137414970759,\n\
\ \"acc_norm_stderr\": 0.03408474084967116,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.5128475358249373,\n\
\ \"mc2_stderr\": 0.015464651537836402\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.01440561827943617,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6440948018323043,\n\
\ \"acc_stderr\": 0.004778081784542406,\n \"acc_norm\": 0.842760406293567,\n\
\ \"acc_norm_stderr\": 0.0036328254791285954\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.0398124054371786,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.0398124054371786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707546,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707546\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307706,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307706\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150013,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150013\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891823,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150191,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150191\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584187,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584187\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.01625113971157077,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.01625113971157077\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824089,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824089\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003486,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003486\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5735294117647058,\n \"acc_stderr\": 0.020007912739359365,\n \
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.020007912739359365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.5128475358249373,\n\
\ \"mc2_stderr\": 0.015464651537836402\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-44-32.660445.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-44-32.660445.parquet'
- config_name: results
data_files:
- split: 2023_10_08T14_44_32.660445
path:
- results_2023-10-08T14-44-32.660445.parquet
- split: latest
path:
- results_2023-10-08T14-44-32.660445.parquet
---
# Dataset Card for Evaluation run of PulsarAI/Chat-AYB-Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/Chat-AYB-Nova-13B](https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
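To see what is available before loading anything, the configuration names can be enumerated with the `datasets` client; this is a minimal sketch, assuming a recent version of the `datasets` library:
```python
from datasets import get_dataset_config_names

# Enumerate the configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B"
)
print(len(configs))  # 61, per the summary above
print(configs[:3])   # e.g. ['harness_arc_challenge_25', 'harness_hellaswag_10', ...]
```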
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B",
"harness_truthfulqa_mc_0",
split="train")
```
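The aggregated metrics can be read from the "results" configuration in the same way; a minimal sketch, assuming the "latest" split defined in this dataset's config (a timestamped split such as `2023_10_08T14_44_32.660445` works the same way):
```python
from datasets import load_dataset

# The "latest" split of the "results" config always tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B",
    "results",
    split="latest",
)
print(results[0])  # index 0 corresponds to the single run recorded so far
```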
## Latest results
These are the [latest results from run 2023-10-08T14:44:32.660445](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B/blob/main/results_2023-10-08T14-44-32.660445.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.586765594212576,
"acc_stderr": 0.03410914044006396,
"acc_norm": 0.5909137414970759,
"acc_norm_stderr": 0.03408474084967116,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.5128475358249373,
"mc2_stderr": 0.015464651537836402
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.01440561827943617,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6440948018323043,
"acc_stderr": 0.004778081784542406,
"acc_norm": 0.842760406293567,
"acc_norm_stderr": 0.0036328254791285954
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.0398124054371786,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.0398124054371786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707546,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707546
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307706,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307706
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285712,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285712
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891823,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891823
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150191,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150191
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584187,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584187
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.01625113971157077,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.01625113971157077
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829028,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829028
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824089,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824089
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003486,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.020007912739359365,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.020007912739359365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.5128475358249373,
"mc2_stderr": 0.015464651537836402
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B | 2023-10-08T14:47:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/Chat-AYB-Platypus2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/Chat-AYB-Platypus2-13B](https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T14:46:05.202813](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B/blob/main/results_2023-10-08T14-46-05.202813.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find them in the \"results\" configuration and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5793506755644279,\n\
\ \"acc_stderr\": 0.03418617983940258,\n \"acc_norm\": 0.5831649759747356,\n\
\ \"acc_norm_stderr\": 0.03416450490851953,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5451670851918026,\n\
\ \"mc2_stderr\": 0.01582581744184166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520765,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938163\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6450906193985262,\n\
\ \"acc_stderr\": 0.0047750796365670966,\n \"acc_norm\": 0.8402708623780123,\n\
\ \"acc_norm_stderr\": 0.0036560593900501065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.03050329201334259,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.03050329201334259\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\"\
: 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845697,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845697\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940784,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654366,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654366\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908707,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908707\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\
\ \"acc_stderr\": 0.014551310568143693,\n \"acc_norm\": 0.7905491698595147,\n\
\ \"acc_norm_stderr\": 0.014551310568143693\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.026152198619726803,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.026152198619726803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4681564245810056,\n\
\ \"acc_stderr\": 0.016688553415612206,\n \"acc_norm\": 0.4681564245810056,\n\
\ \"acc_norm_stderr\": 0.016688553415612206\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602653,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364803,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364803\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\
\ \"acc_stderr\": 0.012716941720734815,\n \"acc_norm\": 0.45436766623207303,\n\
\ \"acc_norm_stderr\": 0.012716941720734815\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5735294117647058,\n \"acc_stderr\": 0.020007912739359375,\n \
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.020007912739359375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5451670851918026,\n\
\ \"mc2_stderr\": 0.01582581744184166\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-46-05.202813.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-46-05.202813.parquet'
- config_name: results
data_files:
- split: 2023_10_08T14_46_05.202813
path:
- results_2023-10-08T14-46-05.202813.parquet
- split: latest
path:
- results_2023-10-08T14-46-05.202813.parquet
---
# Dataset Card for Evaluation run of PulsarAI/Chat-AYB-Platypus2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/Chat-AYB-Platypus2-13B](https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
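Before loading anything, it can help to see which configurations and splits the repository actually exposes. Below is a minimal sketch using the `datasets` library's standard inspection helpers (the exact printed output will depend on your library version):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B"

# List the 61 evaluation configurations available in this repository
configs = get_dataset_config_names(repo)
print(configs[:5])

# Inspect the splits of one configuration: a timestamped run plus "latest"
splits = get_dataset_split_names(repo, "harness_truthfulqa_mc_0")
print(splits)
```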
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
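The same pattern works for any configuration listed in the YAML header above. For example, here is a sketch loading the aggregated "results" configuration at its latest run (the column layout of the underlying parquet file is not documented in this card, so inspect the schema before relying on specific fields):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split tracks the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B",
    "results",
    split="latest",
)
print(results)  # print the features first rather than assuming column names
```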
## Latest results
These are the [latest results from run 2023-10-08T14:46:05.202813](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B/blob/main/results_2023-10-08T14-46-05.202813.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5793506755644279,
"acc_stderr": 0.03418617983940258,
"acc_norm": 0.5831649759747356,
"acc_norm_stderr": 0.03416450490851953,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5451670851918026,
"mc2_stderr": 0.01582581744184166
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520765,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938163
},
"harness|hellaswag|10": {
"acc": 0.6450906193985262,
"acc_stderr": 0.0047750796365670966,
"acc_norm": 0.8402708623780123,
"acc_norm_stderr": 0.0036560593900501065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.03050329201334259,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.03050329201334259
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845697,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845697
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940784,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654366,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654366
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908707,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908707
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143693,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143693
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.026152198619726803,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.026152198619726803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4681564245810056,
"acc_stderr": 0.016688553415612206,
"acc_norm": 0.4681564245810056,
"acc_norm_stderr": 0.016688553415612206
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602653,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364803,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364803
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862744,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862744
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734815,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734815
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.020007912739359375,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.020007912739359375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5451670851918026,
"mc2_stderr": 0.01582581744184166
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B | 2023-10-08T14:52:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/2x-LoRA-Assemble-Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/2x-LoRA-Assemble-Nova-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T14:51:09.823341](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B/blob/main/results_2023-10-08T14-51-09.823341.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5871990415581504,\n\
\ \"acc_stderr\": 0.034205209769974654,\n \"acc_norm\": 0.5912493323556589,\n\
\ \"acc_norm_stderr\": 0.034182563847539614,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5188449816339029,\n\
\ \"mc2_stderr\": 0.015388101762026653\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221009,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.01413770860175909\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6335391356303525,\n\
\ \"acc_stderr\": 0.004808526802718585,\n \"acc_norm\": 0.8324039036048596,\n\
\ \"acc_norm_stderr\": 0.0037274387865133944\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.02692344605930284,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.02692344605930284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649038,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649038\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915331,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915331\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830503,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830503\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154336,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.024904439098918228,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.024904439098918228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150191,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150191\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968824,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968824\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39776536312849164,\n\
\ \"acc_stderr\": 0.01636920497126298,\n \"acc_norm\": 0.39776536312849164,\n\
\ \"acc_norm_stderr\": 0.01636920497126298\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424434,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424434\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.029520095697687765,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.029520095697687765\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5188449816339029,\n\
\ \"mc2_stderr\": 0.015388101762026653\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-51-09.823341.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-51-09.823341.parquet'
- config_name: results
data_files:
- split: 2023_10_08T14_51_09.823341
path:
- results_2023-10-08T14-51-09.823341.parquet
- split: latest
path:
- results_2023-10-08T14-51-09.823341.parquet
---
# Dataset Card for Evaluation run of PulsarAI/2x-LoRA-Assemble-Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/2x-LoRA-Assemble-Nova-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B",
"harness_truthfulqa_mc_0",
split="train")
```
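The same call works for any configuration listed in the YAML frontmatter above; as a minimal sketch (assuming the parquet files are still at their declared paths), a single MMLU subtask can be loaded through its `latest` split:
```python
from datasets import load_dataset

# Config names mirror the harness task names, e.g. the 5-shot
# world_religions subtask. "latest" aliases the newest run, while the
# timestamped split (2023_10_08T14_51_09.823341) pins a specific one.
data = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(data)
```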
## Latest results
These are the [latest results from run 2023-10-08T14:51:09.823341](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B/blob/main/results_2023-10-08T14-51-09.823341.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5871990415581504,
"acc_stderr": 0.034205209769974654,
"acc_norm": 0.5912493323556589,
"acc_norm_stderr": 0.034182563847539614,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5188449816339029,
"mc2_stderr": 0.015388101762026653
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221009,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.01413770860175909
},
"harness|hellaswag|10": {
"acc": 0.6335391356303525,
"acc_stderr": 0.004808526802718585,
"acc_norm": 0.8324039036048596,
"acc_norm_stderr": 0.0037274387865133944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.02692344605930284,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.02692344605930284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649038,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649038
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915331,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915331
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830503,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830503
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154336,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918228,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150191,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150191
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968824,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968824
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39776536312849164,
"acc_stderr": 0.01636920497126298,
"acc_norm": 0.39776536312849164,
"acc_norm_stderr": 0.01636920497126298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424434,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424434
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.01268781841959992,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.01268781841959992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.029520095697687765,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.029520095697687765
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5188449816339029,
"mc2_stderr": 0.015388101762026653
}
}
```
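For convenience, the aggregated metrics above can also be loaded programmatically. A minimal sketch, assuming this dataset follows the same "results" configuration and "latest" split naming used by the other evaluation-detail datasets:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics for each run;
# the "latest" split points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B",
    "results",
    split="latest",
)
print(results[0])  # per-task acc / acc_norm / stderr values, as shown above
```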
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B | 2023-10-08T14:59:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/2x-LoRA-Assemble-Platypus2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/2x-LoRA-Assemble-Platypus2-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T14:58:33.553023](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B/blob/main/results_2023-10-08T14-58-33.553023.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the \"results\" and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5831519747078309,\n\
\ \"acc_stderr\": 0.034192008793577415,\n \"acc_norm\": 0.5870238916351592,\n\
\ \"acc_norm_stderr\": 0.03417160310292123,\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5477106072151011,\n\
\ \"mc2_stderr\": 0.015859389336255567\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868807,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.01428052266746732\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6287592113124876,\n\
\ \"acc_stderr\": 0.004821492994082128,\n \"acc_norm\": 0.8256323441545509,\n\
\ \"acc_norm_stderr\": 0.00378649885676912\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.02450877752102842,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.02450877752102842\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845697,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845697\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036545,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.02463554916390823,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.02463554916390823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.781651376146789,\n \"acc_stderr\": 0.017712600528722727,\n \"\
acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722727\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036416,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036416\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.02441494730454368,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.02441494730454368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823298,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823298\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.02586220185227789,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.02586220185227789\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48156424581005586,\n\
\ \"acc_stderr\": 0.016711130497782813,\n \"acc_norm\": 0.48156424581005586,\n\
\ \"acc_norm_stderr\": 0.016711130497782813\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110307,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889133,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872475,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5477106072151011,\n\
\ \"mc2_stderr\": 0.015859389336255567\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-58-33.553023.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T14-58-33.553023.parquet'
- config_name: results
data_files:
- split: 2023_10_08T14_58_33.553023
path:
- results_2023-10-08T14-58-33.553023.parquet
- split: latest
path:
- results_2023-10-08T14-58-33.553023.parquet
---
# Dataset Card for Evaluation run of PulsarAI/2x-LoRA-Assemble-Platypus2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/2x-LoRA-Assemble-Platypus2-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
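The same pattern works for any configuration listed in this card. A minimal sketch, using config names and split names exactly as they appear in the configuration list above:
```python
from datasets import load_dataset

# Per-sample details for a single MMLU subtask, pinned to the most recent
# run via the "latest" split (a timestamped split also exists per run).
world_religions = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B",
    "results",
    split="latest",
)
```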
## Latest results
These are the [latest results from run 2023-10-08T14:58:33.553023](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B/blob/main/results_2023-10-08T14-58-33.553023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5831519747078309,
"acc_stderr": 0.034192008793577415,
"acc_norm": 0.5870238916351592,
"acc_norm_stderr": 0.03417160310292123,
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5477106072151011,
"mc2_stderr": 0.015859389336255567
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868807,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.01428052266746732
},
"harness|hellaswag|10": {
"acc": 0.6287592113124876,
"acc_stderr": 0.004821492994082128,
"acc_norm": 0.8256323441545509,
"acc_norm_stderr": 0.00378649885676912
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.02450877752102842,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.02450877752102842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845697,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845697
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036545,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.02463554916390823,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.02463554916390823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722727,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722727
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036416,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036416
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02441494730454368,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02441494730454368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823298,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823298
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.02586220185227789,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.02586220185227789
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48156424581005586,
"acc_stderr": 0.016711130497782813,
"acc_norm": 0.48156424581005586,
"acc_norm_stderr": 0.016711130497782813
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110307,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889133,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872475,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5477106072151011,
"mc2_stderr": 0.015859389336255567
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Hxfi/RAVI-AYANOLI | 2023-10-08T15:01:42.000Z | [
"region:us"
] | Hxfi | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B | 2023-10-08T15:06:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/GenAI-Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/GenAI-Nova-13B](https://huggingface.co/PulsarAI/GenAI-Nova-13B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T15:05:19.512883](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B/blob/main/results_2023-10-08T15-05-19.512883.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5950478518617498,\n\
\ \"acc_stderr\": 0.03403770527311197,\n \"acc_norm\": 0.5992203574188173,\n\
\ \"acc_norm_stderr\": 0.03401466396729267,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5179019852720109,\n\
\ \"mc2_stderr\": 0.015198745057161709\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379981,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.0141633668961926\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6317466640111532,\n\
\ \"acc_stderr\": 0.0048134486154044346,\n \"acc_norm\": 0.8327026488747261,\n\
\ \"acc_norm_stderr\": 0.003724783389253327\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.041641887201693775,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.041641887201693775\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342654,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342654\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.02652270967466777,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.02652270967466777\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830506,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830506\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.01709057380421791,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.01709057380421791\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n\
\ \"acc_stderr\": 0.014648172749593518,\n \"acc_norm\": 0.7867177522349936,\n\
\ \"acc_norm_stderr\": 0.014648172749593518\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\
\ \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n\
\ \"acc_norm_stderr\": 0.01665722942458631\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363947,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363947\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630433,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630433\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n\
\ \"acc_stderr\": 0.012705721498565106,\n \"acc_norm\": 0.4498044328552803,\n\
\ \"acc_norm_stderr\": 0.012705721498565106\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.019886221037501862,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.019886221037501862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235936,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235936\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5179019852720109,\n\
\ \"mc2_stderr\": 0.015198745057161709\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/GenAI-Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|arc:challenge|25_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hellaswag|10_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T15-05-19.512883.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T15-05-19.512883.parquet'
- config_name: results
data_files:
- split: 2023_10_08T15_05_19.512883
path:
- results_2023-10-08T15-05-19.512883.parquet
- split: latest
path:
- results_2023-10-08T15-05-19.512883.parquet
---
# Dataset Card for Evaluation run of PulsarAI/GenAI-Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/GenAI-Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/GenAI-Nova-13B](https://huggingface.co/PulsarAI/GenAI-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B",
"harness_truthfulqa_mc_0",
split="train")
```
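As a minimal sketch (assuming the same `load_dataset` pattern shown above, and using only configuration and split names listed in this card's YAML), the aggregated metrics and a run-pinned per-task split can be loaded like this:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B"

# Aggregated metrics for the run, exposed through the "results" configuration;
# "latest" is an alias that follows the most recent evaluation run.
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for one MMLU subtask, pinned to this run's timestamped split
# instead of the moving "latest" alias.
world_religions = load_dataset(
    REPO,
    "harness_hendrycksTest_world_religions_5",
    split="2023_10_08T15_05_19.512883",
)
```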
## Latest results
These are the [latest results from run 2023-10-08T15:05:19.512883](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B/blob/main/results_2023-10-08T15-05-19.512883.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5950478518617498,
"acc_stderr": 0.03403770527311197,
"acc_norm": 0.5992203574188173,
"acc_norm_stderr": 0.03401466396729267,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5179019852720109,
"mc2_stderr": 0.015198745057161709
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379981,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.0141633668961926
},
"harness|hellaswag|10": {
"acc": 0.6317466640111532,
"acc_stderr": 0.0048134486154044346,
"acc_norm": 0.8327026488747261,
"acc_norm_stderr": 0.003724783389253327
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.041641887201693775,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.041641887201693775
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342654,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342654
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.02652270967466777,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.02652270967466777
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830506,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.01709057380421791,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.01709057380421791
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593518,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.01665722942458631,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.01665722942458631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363947,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363947
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630433,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630433
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.012705721498565106,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.012705721498565106
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.019886221037501862,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.019886221037501862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235936,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5179019852720109,
"mc2_stderr": 0.015198745057161709
}
}
```
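For quick inspection without loading the per-example details, the aggregated results file linked above can also be fetched directly. A minimal sketch (not part of the generated card) with placeholder identifiers; substitute this card's actual repo id and results filename:
```python
import json

from huggingface_hub import hf_hub_download

# Placeholders for illustration only; use this card's actual repo and filename.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_<org>__<model>",
    filename="results_<timestamp>.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The dict printed above may sit at the root or under a top-level "results" key.
results = data.get("results", data)
for task, metrics in sorted(results.items()):
    if task.startswith("harness|hendrycksTest"):
        print(f'{task}: {metrics["acc"]:.3f} +/- {metrics["acc_stderr"]:.3f}')
```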
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hsali/audio_dataset1 | 2023-10-08T15:09:15.000Z | [
"region:us"
] | hsali | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: file_id
dtype: string
splits:
- name: train
num_bytes: 439185.0
num_examples: 5
download_size: 427010
dataset_size: 439185.0
---
# Dataset Card for "audio_dataset1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part0_test | 2023-10-08T15:31:50.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 13428556
num_examples: 1408
download_size: 16660183
dataset_size: 13428556
---
# Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part0_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part1_test | 2023-10-08T15:32:10.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 12230881
num_examples: 1283
download_size: 14962458
dataset_size: 12230881
---
# Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part1_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Intuit-GenSRF/jquiros-suicide-es | 2023-10-08T15:39:21.000Z | [
"region:us"
] | Intuit-GenSRF | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
- name: processed_text
sequence: string
- name: num_tokens
dtype: int64
- name: text_es
dtype: string
splits:
- name: train
num_bytes: 434028422
num_examples: 230832
download_size: 266158998
dataset_size: 434028422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "jquiros-suicide-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eu001/testezinho | 2023-10-08T15:44:44.000Z | [
"license:openrail",
"region:us"
] | Eu001 | null | null | null | 0 | 0 | ---
license: openrail
---
|
ppxscal/embedding_network | 2023-10-09T01:21:10.000Z | [
"license:mit",
"region:us"
] | ppxscal | null | null | null | 0 | 0 | ---
license: mit
---
|
BounharAbdelaziz/Face-Aging-Dataset | 2023-10-08T16:16:52.000Z | [
"region:us"
] | BounharAbdelaziz | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': age_domain_20_35
'1': age_domain_36_60
'2': age_domain_60_90
splits:
- name: train
num_bytes: 16136815235.988
num_examples: 40252
download_size: 16202626214
dataset_size: 16136815235.988
---
# Dataset Card for "Face-Aging-Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BounharAbdelaziz/Face-Gender-Swap | 2023-10-08T16:25:03.000Z | [
"region:us"
] | BounharAbdelaziz | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': domain_F
'1': domain_M
splits:
- name: train
num_bytes: 20710300480.468
num_examples: 51604
download_size: 20737281406
dataset_size: 20710300480.468
---
# Dataset Card for "Face-Gender-Swap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osanseviero/ag_misclassifications | 2023-10-08T15:57:20.000Z | [
"region:us"
] | osanseviero | null | null | null | 0 | 0 | This dataset contains a slice of 200 samples from the [AG News](https://huggingface.co/datasets/ag_news) dataset (test split).
These 200 samples were selected as potential misclassifications of the original test data.
## Approach
* Fine-tune DistilBERT with 10k samples from the training data (out of 120k)
* Do a forward pass with the model, storing the per-sample loss
* Sort the samples by loss, highest first (a sketch of this procedure follows the list)
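A minimal sketch of this loss-ranking procedure (assumptions: `model_name` is a placeholder for the fine-tuned checkpoint, batch size 32, and AG News's four classes):
```python
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "distilbert-base-uncased"  # placeholder: use the fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=4)
model.eval()

test = load_dataset("ag_news", split="test")
loss_fn = torch.nn.CrossEntropyLoss(reduction="none")  # keep per-sample losses
losses = []
with torch.no_grad():
    for start in range(0, len(test), 32):
        batch = test[start : start + 32]
        enc = tokenizer(batch["text"], truncation=True, padding=True, return_tensors="pt")
        logits = model(**enc).logits
        losses.extend(loss_fn(logits, torch.tensor(batch["label"])).tolist())

# The highest-loss samples are the candidate misclassifications kept here.
worst_200 = sorted(range(len(losses)), key=lambda i: losses[i], reverse=True)[:200]
```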
This is a repository for demonstration purposes |
BangumiBase/rurounikenshin2023 | 2023-10-08T18:47:36.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Rurouni Kenshin (2023)
This is the image base of the bangumi Rurouni Kenshin (2023). We detected 27 characters and 3015 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to remove potentially noisy samples (roughly 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 363 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 376 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 30 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 58 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 209 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 81 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 17 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 86 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 122 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 338 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 34 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 43 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 27 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 27 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 67 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 49 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 7 | [Download](16/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 17 | 189 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 21 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 16 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 688 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 13 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 14 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 31 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 33 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 9 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 67 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
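Individual character archives from the table above can be fetched directly; a minimal sketch (not part of the original card), assuming the relative `dataset.zip` paths are files in this repo:
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download character #4's archive (209 images per the table above) and extract it.
path = hf_hub_download(
    repo_id="BangumiBase/rurounikenshin2023",
    filename="4/dataset.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("rurounikenshin2023_char4")
```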
|
Praghxx/Rick | 2023-10-10T14:36:20.000Z | [
"license:openrail",
"region:us"
] | Praghxx | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit | 2023-10-08T16:14:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T16:13:20.175189](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit/blob/main/results_2023-10-08T16-13-20.175189.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4967906759463178,\n\
\ \"acc_stderr\": 0.035093749842694276,\n \"acc_norm\": 0.5007669702461871,\n\
\ \"acc_norm_stderr\": 0.03508171542248901,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237022,\n \"mc2\": 0.40325349902822055,\n\
\ \"mc2_stderr\": 0.013838198652960687\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48208191126279865,\n \"acc_stderr\": 0.01460200558549098,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490975\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5653256323441546,\n\
\ \"acc_stderr\": 0.004947010937455343,\n \"acc_norm\": 0.7640908185620394,\n\
\ \"acc_norm_stderr\": 0.004236980145344305\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.041808067502949374,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.041808067502949374\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5548387096774193,\n \"acc_stderr\": 0.028272410186214906,\n \"\
acc_norm\": 0.5548387096774193,\n \"acc_norm_stderr\": 0.028272410186214906\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.034648816750163396,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.034648816750163396\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4789915966386555,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.4789915966386555,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.689908256880734,\n\
\ \"acc_stderr\": 0.019830849684439756,\n \"acc_norm\": 0.689908256880734,\n\
\ \"acc_norm_stderr\": 0.019830849684439756\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488419,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.0299366963871386,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.0299366963871386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012349,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012349\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6794380587484036,\n\
\ \"acc_stderr\": 0.016688893310803747,\n \"acc_norm\": 0.6794380587484036,\n\
\ \"acc_norm_stderr\": 0.016688893310803747\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.026915047355369818,\n\
\ \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.026915047355369818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761987,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761987\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.02827549015679146,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.02827549015679146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401273,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401273\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347666,\n\
\ \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347666\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n\
\ \"acc_stderr\": 0.012286991879902882,\n \"acc_norm\": 0.363754889178618,\n\
\ \"acc_norm_stderr\": 0.012286991879902882\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121603,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121603\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4673202614379085,\n \"acc_stderr\": 0.020184583359102202,\n \
\ \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.020184583359102202\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872485,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872485\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.03345563070339192,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.03345563070339192\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237022,\n \"mc2\": 0.40325349902822055,\n\
\ \"mc2_stderr\": 0.013838198652960687\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|arc:challenge|25_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hellaswag|10_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T16-13-20.175189.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T16-13-20.175189.parquet'
- config_name: results
data_files:
- split: 2023_10_08T16_13_20.175189
path:
- results_2023-10-08T16-13-20.175189.parquet
- split: latest
path:
- results_2023-10-08T16-13-20.175189.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit",
                    "harness_truthfulqa_mc_0",
                    split="latest")
```
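Because every configuration shares the same split layout (one timestamped split per run plus `latest`), you can also enumerate the configurations and load the newest run of any task directly. A minimal sketch using the `datasets` library (the config name below is taken from the YAML header above):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit"

# Enumerate the per-task configurations (plus the aggregated "results" one).
configs = get_dataset_config_names(repo)
print(len(configs))

# The "latest" split of any configuration always points at the newest run.
details = load_dataset(repo, "harness_hendrycksTest_virology_5", split="latest")
print(details.column_names)
```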
## Latest results
These are the [latest results from run 2023-10-08T16:13:20.175189](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit/blob/main/results_2023-10-08T16-13-20.175189.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4967906759463178,
"acc_stderr": 0.035093749842694276,
"acc_norm": 0.5007669702461871,
"acc_norm_stderr": 0.03508171542248901,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237022,
"mc2": 0.40325349902822055,
"mc2_stderr": 0.013838198652960687
},
"harness|arc:challenge|25": {
"acc": 0.48208191126279865,
"acc_stderr": 0.01460200558549098,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490975
},
"harness|hellaswag|10": {
"acc": 0.5653256323441546,
"acc_stderr": 0.004947010937455343,
"acc_norm": 0.7640908185620394,
"acc_norm_stderr": 0.004236980145344305
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.03070948699255655,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.03070948699255655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.041808067502949374,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.041808067502949374
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.041349130183033156,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.041349130183033156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5548387096774193,
"acc_stderr": 0.028272410186214906,
"acc_norm": 0.5548387096774193,
"acc_norm_stderr": 0.028272410186214906
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.034648816750163396,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.034648816750163396
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4789915966386555,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.4789915966386555,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.689908256880734,
"acc_stderr": 0.019830849684439756,
"acc_norm": 0.689908256880734,
"acc_norm_stderr": 0.019830849684439756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.0299366963871386,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.0299366963871386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012349,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012349
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.0282863240755644,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.0282863240755644
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6794380587484036,
"acc_stderr": 0.016688893310803747,
"acc_norm": 0.6794380587484036,
"acc_norm_stderr": 0.016688893310803747
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.026915047355369818,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.026915047355369818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761987,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761987
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.02827549015679146,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.02827549015679146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401273,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347666,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347666
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.363754889178618,
"acc_stderr": 0.012286991879902882,
"acc_norm": 0.363754889178618,
"acc_norm_stderr": 0.012286991879902882
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121603,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121603
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.020184583359102202,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.020184583359102202
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872485,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872485
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339192,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339192
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237022,
"mc2": 0.40325349902822055,
"mc2_stderr": 0.013838198652960687
}
}
```
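To work with these aggregated numbers programmatically rather than copying them from the JSON above, you can load the "results" configuration defined in the YAML header. A minimal sketch, assuming the aggregated metrics are exposed as fields of the results table (the exact schema may vary between leaderboard versions):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit"

# The "results" config stores the aggregated metrics; "latest" tracks the newest run.
results = load_dataset(repo, "results", split="latest")
print(results.column_names)

# Peek at the first record of the aggregated results.
print(results[0])
```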
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mfmezger/deu-medical_meadow_mmmlu | 2023-10-08T16:35:05.000Z | [
"region:us"
] | mfmezger | null | null | null | 0 | 0 | Entry not found |
mfmezger/deu-medical_meadow_pubmed_causal | 2023-10-08T16:35:30.000Z | [
"region:us"
] | mfmezger | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_LeoLM__leo-hessianai-7b | 2023-10-08T17:17:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LeoLM/leo-hessianai-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeoLM/leo-hessianai-7b](https://huggingface.co/LeoLM/leo-hessianai-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeoLM__leo-hessianai-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T17:16:14.181420](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b/blob/main/results_2023-10-08T17-16-14.181420.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43163551318508864,\n\
\ \"acc_stderr\": 0.035236327448419084,\n \"acc_norm\": 0.43567372133468174,\n\
\ \"acc_norm_stderr\": 0.03522497637301333,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520681,\n \"mc2\": 0.3793584526849439,\n\
\ \"mc2_stderr\": 0.013664128519867424\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47525597269624575,\n \"acc_stderr\": 0.014593487694937738,\n\
\ \"acc_norm\": 0.5196245733788396,\n \"acc_norm_stderr\": 0.014600132075947087\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5645289782911771,\n\
\ \"acc_stderr\": 0.004948052131344494,\n \"acc_norm\": 0.7584146584345748,\n\
\ \"acc_norm_stderr\": 0.004271694301395477\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835361,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835361\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.44193548387096776,\n\
\ \"acc_stderr\": 0.02825155790684974,\n \"acc_norm\": 0.44193548387096776,\n\
\ \"acc_norm_stderr\": 0.02825155790684974\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.038783721137112745,\n\
\ \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.038783721137112745\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5906735751295337,\n \"acc_stderr\": 0.03548608168860806,\n\
\ \"acc_norm\": 0.5906735751295337,\n \"acc_norm_stderr\": 0.03548608168860806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.40512820512820513,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5724770642201835,\n \"acc_stderr\": 0.021210910204300437,\n \"\
acc_norm\": 0.5724770642201835,\n \"acc_norm_stderr\": 0.021210910204300437\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.03498501649369527,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4349775784753363,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.4349775784753363,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.045077322787750874,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.045077322787750874\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.04950504382128921,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.04950504382128921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.03240847393516327,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.03240847393516327\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5823754789272031,\n\
\ \"acc_stderr\": 0.017635637326951507,\n \"acc_norm\": 0.5823754789272031,\n\
\ \"acc_norm_stderr\": 0.017635637326951507\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.41329479768786126,\n \"acc_stderr\": 0.026511261369409244,\n\
\ \"acc_norm\": 0.41329479768786126,\n \"acc_norm_stderr\": 0.026511261369409244\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4542483660130719,\n \"acc_stderr\": 0.028509807802626557,\n\
\ \"acc_norm\": 0.4542483660130719,\n \"acc_norm_stderr\": 0.028509807802626557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5273311897106109,\n\
\ \"acc_stderr\": 0.02835563356832818,\n \"acc_norm\": 0.5273311897106109,\n\
\ \"acc_norm_stderr\": 0.02835563356832818\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611327,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611327\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3161668839634941,\n\
\ \"acc_stderr\": 0.011875780894386583,\n \"acc_norm\": 0.3161668839634941,\n\
\ \"acc_norm_stderr\": 0.011875780894386583\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4215686274509804,\n \"acc_stderr\": 0.019977422600227467,\n \
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.019977422600227467\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163907,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163907\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n\
\ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n\
\ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880844,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880844\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6257309941520468,\n \"acc_stderr\": 0.03711601185389481,\n\
\ \"acc_norm\": 0.6257309941520468,\n \"acc_norm_stderr\": 0.03711601185389481\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520681,\n \"mc2\": 0.3793584526849439,\n\
\ \"mc2_stderr\": 0.013664128519867424\n }\n}\n```"
repo_url: https://huggingface.co/LeoLM/leo-hessianai-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|arc:challenge|25_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hellaswag|10_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T17-16-14.181420.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T17-16-14.181420.parquet'
- config_name: results
data_files:
- split: 2023_10_08T17_16_14.181420
path:
- results_2023-10-08T17-16-14.181420.parquet
- split: latest
path:
- results_2023-10-08T17-16-14.181420.parquet
---
# Dataset Card for Evaluation run of LeoLM/leo-hessianai-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LeoLM/leo-hessianai-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LeoLM/leo-hessianai-7b](https://huggingface.co/LeoLM/leo-hessianai-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeoLM__leo-hessianai-7b",
"harness_truthfulqa_mc_0",
split="train")
```
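Each per-task configuration listed above also exposes a `latest` split alias alongside the timestamped one. As a minimal sketch (using a config name taken from this card's own YAML), the most recent details for a single task can be loaded like this:
```python
from datasets import load_dataset

# "latest" is the split alias that tracks the most recent timestamped run
# for this configuration (see the data_files mapping in the YAML above).
details = load_dataset(
    "open-llm-leaderboard/details_LeoLM__leo-hessianai-7b",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details)
```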
## Latest results
These are the [latest results from run 2023-10-08T17:16:14.181420](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b/blob/main/results_2023-10-08T17-16-14.181420.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43163551318508864,
"acc_stderr": 0.035236327448419084,
"acc_norm": 0.43567372133468174,
"acc_norm_stderr": 0.03522497637301333,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520681,
"mc2": 0.3793584526849439,
"mc2_stderr": 0.013664128519867424
},
"harness|arc:challenge|25": {
"acc": 0.47525597269624575,
"acc_stderr": 0.014593487694937738,
"acc_norm": 0.5196245733788396,
"acc_norm_stderr": 0.014600132075947087
},
"harness|hellaswag|10": {
"acc": 0.5645289782911771,
"acc_stderr": 0.004948052131344494,
"acc_norm": 0.7584146584345748,
"acc_norm_stderr": 0.004271694301395477
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835361,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835361
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.02825155790684974,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.02825155790684974
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.038783721137112745,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.038783721137112745
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5906735751295337,
"acc_stderr": 0.03548608168860806,
"acc_norm": 0.5906735751295337,
"acc_norm_stderr": 0.03548608168860806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.40512820512820513,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.40512820512820513,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945273,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945273
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5724770642201835,
"acc_stderr": 0.021210910204300437,
"acc_norm": 0.5724770642201835,
"acc_norm_stderr": 0.021210910204300437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4349775784753363,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.4349775784753363,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.045077322787750874,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.045077322787750874
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.04950504382128921,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.04950504382128921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.03240847393516327,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.03240847393516327
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5823754789272031,
"acc_stderr": 0.017635637326951507,
"acc_norm": 0.5823754789272031,
"acc_norm_stderr": 0.017635637326951507
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41329479768786126,
"acc_stderr": 0.026511261369409244,
"acc_norm": 0.41329479768786126,
"acc_norm_stderr": 0.026511261369409244
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4542483660130719,
"acc_stderr": 0.028509807802626557,
"acc_norm": 0.4542483660130719,
"acc_norm_stderr": 0.028509807802626557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5273311897106109,
"acc_stderr": 0.02835563356832818,
"acc_norm": 0.5273311897106109,
"acc_norm_stderr": 0.02835563356832818
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611327,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611327
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3161668839634941,
"acc_stderr": 0.011875780894386583,
"acc_norm": 0.3161668839634941,
"acc_norm_stderr": 0.011875780894386583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.019977422600227467,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.019977422600227467
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.03200682020163907,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.03200682020163907
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880844,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880844
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6257309941520468,
"acc_stderr": 0.03711601185389481,
"acc_norm": 0.6257309941520468,
"acc_norm_stderr": 0.03711601185389481
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520681,
"mc2": 0.3793584526849439,
"mc2_stderr": 0.013664128519867424
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mfidabel/dscovr_kp_data_2016_2023 | 2023-10-08T17:22:05.000Z | [
"license:mit",
"region:us"
] | mfidabel | null | null | null | 0 | 0 | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: TIMESTAMP
dtype: string
- name: KP_GRAL
dtype: float64
- name: GSE_X
dtype: float64
- name: GSE_Y
dtype: float64
- name: GSE_Z
dtype: float64
- name: CUADRADO
dtype: float64
- name: RAW_4
dtype: float64
- name: RAW_5
dtype: float64
- name: RAW_6
dtype: float64
- name: RAW_7
dtype: float64
- name: RAW_8
dtype: float64
- name: RAW_9
dtype: float64
- name: RAW_10
dtype: float64
- name: RAW_11
dtype: float64
- name: RAW_12
dtype: float64
- name: RAW_13
dtype: float64
- name: RAW_14
dtype: float64
- name: RAW_15
dtype: float64
- name: RAW_16
dtype: float64
- name: RAW_17
dtype: float64
- name: RAW_18
dtype: float64
- name: RAW_19
dtype: float64
- name: RAW_20
dtype: float64
- name: RAW_21
dtype: float64
- name: RAW_22
dtype: float64
- name: RAW_23
dtype: float64
- name: RAW_24
dtype: float64
splits:
- name: train
num_bytes: 762145515
num_examples: 3277440
download_size: 498355302
dataset_size: 762145515
---
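A minimal loading sketch, assuming only what the metadata above declares (a single `default` config whose `train` split holds 3,277,440 rows):
```python
from datasets import load_dataset

# Load the train split declared in the dataset metadata.
ds = load_dataset("mfidabel/dscovr_kp_data_2016_2023", split="train")

# Columns follow the declared features: TIMESTAMP (string) plus KP_GRAL,
# GSE_X/GSE_Y/GSE_Z, CUADRADO and RAW_4..RAW_24 (all float64).
print(ds.features)
print(ds[0]["TIMESTAMP"], ds[0]["KP_GRAL"])
```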
|
0-hero/prompt-perfect | 2023-10-08T17:27:15.000Z | [
"region:us"
] | 0-hero | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b | 2023-10-08T17:28:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of pankajmathur/orca_mini_v3_13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/orca_mini_v3_13b](https://huggingface.co/pankajmathur/orca_mini_v3_13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T17:27:15.323068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b/blob/main/results_2023-10-08T17-27-15.323068.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5668380558829594,\n\
\ \"acc_stderr\": 0.034254528765444284,\n \"acc_norm\": 0.5706935928296689,\n\
\ \"acc_norm_stderr\": 0.03423366818930818,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5180548254049244,\n\
\ \"mc2_stderr\": 0.015711587821782075\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279543,\n\
\ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042192\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6250746863174667,\n\
\ \"acc_stderr\": 0.004831142570475503,\n \"acc_norm\": 0.823541127265485,\n\
\ \"acc_norm_stderr\": 0.003804310123682779\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112147,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112147\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943252,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245265,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245265\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7541284403669725,\n \"acc_stderr\": 0.01846194096870843,\n \"\
acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.01846194096870843\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890484,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890484\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.015190473717037498,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.015190473717037498\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865464,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.02672586880910079,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.02672586880910079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370597,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370597\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534421,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534421\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.03437519337338252,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.03437519337338252\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5180548254049244,\n\
\ \"mc2_stderr\": 0.015711587821782075\n }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/orca_mini_v3_13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|arc:challenge|25_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hellaswag|10_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T17-27-15.323068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T17-27-15.323068.parquet'
- config_name: results
data_files:
- split: 2023_10_08T17_27_15.323068
path:
- results_2023-10-08T17-27-15.323068.parquet
- split: latest
path:
- results_2023-10-08T17-27-15.323068.parquet
---
# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/orca_mini_v3_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/orca_mini_v3_13b](https://huggingface.co/pankajmathur/orca_mini_v3_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b",
"harness_truthfulqa_mc_0",
split="train")
```
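Since the "latest" split of each configuration always tracks the most recent run, the aggregated metrics can be loaded the same way; a minimal sketch using the "results" configuration listed in the YAML header above:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always resolves to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b",
                       "results",
                       split="latest")
```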
## Latest results
These are the [latest results from run 2023-10-08T17:27:15.323068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b/blob/main/results_2023-10-08T17-27-15.323068.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5668380558829594,
"acc_stderr": 0.034254528765444284,
"acc_norm": 0.5706935928296689,
"acc_norm_stderr": 0.03423366818930818,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5180548254049244,
"mc2_stderr": 0.015711587821782075
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279543,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.014097810678042192
},
"harness|hellaswag|10": {
"acc": 0.6250746863174667,
"acc_stderr": 0.004831142570475503,
"acc_norm": 0.823541127265485,
"acc_norm_stderr": 0.003804310123682779
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112147,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112147
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943252,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245265,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245265
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7541284403669725,
"acc_stderr": 0.01846194096870843,
"acc_norm": 0.7541284403669725,
"acc_norm_stderr": 0.01846194096870843
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890484,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890484
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.015190473717037498,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.015190473717037498
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865464,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.02672586880910079,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.02672586880910079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370597,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370597
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534421,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534421
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.03437519337338252,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.03437519337338252
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5180548254049244,
"mc2_stderr": 0.015711587821782075
}
}
```
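To drill into the per-sample details behind any of these aggregates, load the corresponding task configuration with its "latest" split; a sketch using one of the config names from the YAML header above:
```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask; "latest" resolves to the
# timestamped parquet file of the most recent evaluation run.
details = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b",
                       "harness_hendrycksTest_world_religions_5",
                       split="latest")
```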
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
satoblack/asd | 2023-10-08T17:46:41.000Z | [
"region:us"
] | satoblack | null | null | null | 0 | 0 | Entry not found |
fudong03/Tiny-CropNet | 2023-10-10T06:53:58.000Z | [
"region:us"
] | fudong03 | null | null | null | 0 | 0 | Entry not found |
SojeongKim/nlp_study | 2023-10-08T17:53:03.000Z | [
"region:us"
] | SojeongKim | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_LeoLM__leo-hessianai-13b | 2023-10-08T18:00:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LeoLM/leo-hessianai-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeoLM/leo-hessianai-13b](https://huggingface.co/LeoLM/leo-hessianai-13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeoLM__leo-hessianai-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T17:59:31.182651](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-13b/blob/main/results_2023-10-08T17-59-31.182651.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5376477853539503,\n\
\ \"acc_stderr\": 0.03476442164432789,\n \"acc_norm\": 0.5418757587616869,\n\
\ \"acc_norm_stderr\": 0.034744925568465125,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.3803151079694499,\n\
\ \"mc2_stderr\": 0.013623001978748491\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294326,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.01445686294465065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6134236207926708,\n\
\ \"acc_stderr\": 0.004859699562451457,\n \"acc_norm\": 0.8193586934873531,\n\
\ \"acc_norm_stderr\": 0.0038393444971919536\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467516,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467516\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873502,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873502\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.02770935967503249,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.02770935967503249\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.025342671293807264,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.025342671293807264\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5336134453781513,\n \"acc_stderr\": 0.03240501447690071,\n \
\ \"acc_norm\": 0.5336134453781513,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7174311926605504,\n \"acc_stderr\": 0.019304243497707152,\n \"\
acc_norm\": 0.7174311926605504,\n \"acc_norm_stderr\": 0.019304243497707152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842544,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842544\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.02828632407556438,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.02828632407556438\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n\
\ \"acc_stderr\": 0.01572083867844527,\n \"acc_norm\": 0.7381864623243933,\n\
\ \"acc_norm_stderr\": 0.01572083867844527\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.0151665445504903,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.0151665445504903\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839796,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543458,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40221642764015647,\n\
\ \"acc_stderr\": 0.012523646856180178,\n \"acc_norm\": 0.40221642764015647,\n\
\ \"acc_norm_stderr\": 0.012523646856180178\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5310457516339869,\n \"acc_stderr\": 0.02018880445636189,\n \
\ \"acc_norm\": 0.5310457516339869,\n \"acc_norm_stderr\": 0.02018880445636189\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209204,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209204\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.3803151079694499,\n\
\ \"mc2_stderr\": 0.013623001978748491\n }\n}\n```"
repo_url: https://huggingface.co/LeoLM/leo-hessianai-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|arc:challenge|25_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hellaswag|10_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T17-59-31.182651.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T17-59-31.182651.parquet'
- config_name: results
data_files:
- split: 2023_10_08T17_59_31.182651
path:
- results_2023-10-08T17-59-31.182651.parquet
- split: latest
path:
- results_2023-10-08T17-59-31.182651.parquet
---
# Dataset Card for Evaluation run of LeoLM/leo-hessianai-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LeoLM/leo-hessianai-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LeoLM/leo-hessianai-13b](https://huggingface.co/LeoLM/leo-hessianai-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeoLM__leo-hessianai-13b",
"harness_truthfulqa_mc_0",
split="train")
```
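The per-run aggregated metrics can be loaded the same way from the `results` configuration listed in this card. A minimal sketch, assuming only the configuration and split names documented above:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_LeoLM__leo-hessianai-13b",
	"results",
	split="latest")

# Inspect the stored metrics for the single run in this dataset.
print(results[0])
```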
## Latest results
These are the [latest results from run 2023-10-08T17:59:31.182651](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-13b/blob/main/results_2023-10-08T17-59-31.182651.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5376477853539503,
"acc_stderr": 0.03476442164432789,
"acc_norm": 0.5418757587616869,
"acc_norm_stderr": 0.034744925568465125,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.0153218216884762,
"mc2": 0.3803151079694499,
"mc2_stderr": 0.013623001978748491
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294326,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.01445686294465065
},
"harness|hellaswag|10": {
"acc": 0.6134236207926708,
"acc_stderr": 0.004859699562451457,
"acc_norm": 0.8193586934873531,
"acc_norm_stderr": 0.0038393444971919536
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467516,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467516
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873502,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873502
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.02770935967503249,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.02770935967503249
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.025342671293807264,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.025342671293807264
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5336134453781513,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.5336134453781513,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7174311926605504,
"acc_stderr": 0.019304243497707152,
"acc_norm": 0.7174311926605504,
"acc_norm_stderr": 0.019304243497707152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842544,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842544
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.02828632407556438,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.02828632407556438
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.01572083867844527,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.01572083867844527
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.0151665445504903,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.0151665445504903
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011998,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839796,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543458,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40221642764015647,
"acc_stderr": 0.012523646856180178,
"acc_norm": 0.40221642764015647,
"acc_norm_stderr": 0.012523646856180178
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5310457516339869,
"acc_stderr": 0.02018880445636189,
"acc_norm": 0.5310457516339869,
"acc_norm_stderr": 0.02018880445636189
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209204,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209204
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.0153218216884762,
"mc2": 0.3803151079694499,
"mc2_stderr": 0.013623001978748491
}
}
```
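To work with these numbers programmatically, the raw results file linked above can be downloaded directly. A minimal sketch, assuming the downloaded JSON mirrors the dictionary printed above (the `huggingface_hub` download call is standard, but the file layout is an assumption):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_LeoLM__leo-hessianai-13b",
    filename="results_2023-10-08T17-59-31.182651.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)  # assumed to mirror the dictionary shown above

# Macro-average accuracy over the hendrycksTest (MMLU) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(sum(mmlu_accs) / len(mmlu_accs))
```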
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test | 2023-10-08T18:15:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Lazycuber/L2-7b-Guanaco-Random-Test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lazycuber/L2-7b-Guanaco-Random-Test](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T18:13:47.081600](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test/blob/main/results_2023-10-08T18-13-47.081600.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47820349788584665,\n\
\ \"acc_stderr\": 0.03520803674350638,\n \"acc_norm\": 0.4820937504834085,\n\
\ \"acc_norm_stderr\": 0.03519557788566828,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.4232640996589444,\n\
\ \"mc2_stderr\": 0.01477991946603906\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5723959370643298,\n\
\ \"acc_stderr\": 0.004937199759947679,\n \"acc_norm\": 0.7720573590918144,\n\
\ \"acc_norm_stderr\": 0.004186480645315568\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.042849586397533994,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.042849586397533994\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.0238652068369726,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.0238652068369726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n\
\ \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.5290322580645161,\n\
\ \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n\
\ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6642201834862386,\n \"acc_stderr\": 0.020248081396752927,\n \"\
acc_norm\": 0.6642201834862386,\n \"acc_norm_stderr\": 0.020248081396752927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536016,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088298,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088298\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.0314506860074486,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.0314506860074486\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.016617501738763387,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.016617501738763387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.02688264343402289,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.02688264343402289\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n\
\ \"acc_stderr\": 0.014005843570897895,\n \"acc_norm\": 0.22681564245810057,\n\
\ \"acc_norm_stderr\": 0.014005843570897895\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.02827435985489426,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.02827435985489426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668763,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668763\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199495,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199495\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32790091264667537,\n\
\ \"acc_stderr\": 0.011989936640666525,\n \"acc_norm\": 0.32790091264667537,\n\
\ \"acc_norm_stderr\": 0.011989936640666525\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46895424836601307,\n \"acc_stderr\": 0.020188804456361883,\n \
\ \"acc_norm\": 0.46895424836601307,\n \"acc_norm_stderr\": 0.020188804456361883\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.03197694118713672,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.03197694118713672\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.4232640996589444,\n\
\ \"mc2_stderr\": 0.01477991946603906\n }\n}\n```"
repo_url: https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-13-47.081600.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-13-47.081600.parquet'
- config_name: results
data_files:
- split: 2023_10_08T18_13_47.081600
path:
- results_2023-10-08T18-13-47.081600.parquet
- split: latest
path:
- results_2023-10-08T18-13-47.081600.parquet
---
# Dataset Card for Evaluation run of Lazycuber/L2-7b-Guanaco-Random-Test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lazycuber/L2-7b-Guanaco-Random-Test](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test",
"harness_truthfulqa_mc_0",
split="train")
```
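The same call works for any of the 61 configurations listed in the YAML above. As a further illustration, here is a minimal sketch that loads the aggregated "results" configuration at its "latest" split (both names are defined in the configs above):
```python
from datasets import load_dataset

# "results" aggregates the metrics of the whole run; the "latest" split
# always points at the most recent evaluation of this model.
results = load_dataset(
    "open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test",
    "results",
    split="latest",
)
```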
## Latest results
These are the [latest results from run 2023-10-08T18:13:47.081600](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test/blob/main/results_2023-10-08T18-13-47.081600.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47820349788584665,
"acc_stderr": 0.03520803674350638,
"acc_norm": 0.4820937504834085,
"acc_norm_stderr": 0.03519557788566828,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.4232640996589444,
"mc2_stderr": 0.01477991946603906
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255795
},
"harness|hellaswag|10": {
"acc": 0.5723959370643298,
"acc_stderr": 0.004937199759947679,
"acc_norm": 0.7720573590918144,
"acc_norm_stderr": 0.004186480645315568
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.042849586397533994,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.042849586397533994
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.0238652068369726,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.0238652068369726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5290322580645161,
"acc_stderr": 0.028396016402761005,
"acc_norm": 0.5290322580645161,
"acc_norm_stderr": 0.028396016402761005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.024962683564331803,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.024962683564331803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6642201834862386,
"acc_stderr": 0.020248081396752927,
"acc_norm": 0.6642201834862386,
"acc_norm_stderr": 0.020248081396752927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536016,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088298,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088298
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.0314506860074486,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.0314506860074486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212093,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212093
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763387,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.02688264343402289,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.02688264343402289
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22681564245810057,
"acc_stderr": 0.014005843570897895,
"acc_norm": 0.22681564245810057,
"acc_norm_stderr": 0.014005843570897895
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489426,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668763,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668763
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199495,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199495
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32790091264667537,
"acc_stderr": 0.011989936640666525,
"acc_norm": 0.32790091264667537,
"acc_norm_stderr": 0.011989936640666525
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46895424836601307,
"acc_stderr": 0.020188804456361883,
"acc_norm": 0.46895424836601307,
"acc_norm_stderr": 0.020188804456361883
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.03197694118713672,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.03197694118713672
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.4232640996589444,
"mc2_stderr": 0.01477991946603906
}
}
```
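To work with these numbers programmatically rather than reading the JSON by eye, you can download the linked results file directly. The sketch below assumes the file's top-level structure mirrors the dict shown above (the hosted file may nest these metrics under an additional key):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above; repo_type="dataset" is required
# because the file lives in a dataset repo rather than a model repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test",
    filename="results_2023-10-08T18-13-47.081600.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Assumes the top level matches the dict printed above.
print(results["all"]["acc"])  # overall accuracy, ~0.478 for this run
```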
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0 | 2023-10-08T18:15:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-3.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-13b-3.0](https://huggingface.co/jondurbin/airoboros-l2-13b-3.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T18:14:31.712178](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0/blob/main/results_2023-10-08T18-14-31.712178.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5502881382765671,\n\
\ \"acc_stderr\": 0.034374799430363756,\n \"acc_norm\": 0.5543095430488207,\n\
\ \"acc_norm_stderr\": 0.03435303362752815,\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950833,\n \"mc2\": 0.47789419168905956,\n\
\ \"mc2_stderr\": 0.015029215162188553\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.632244572794264,\n\
\ \"acc_stderr\": 0.004812088620277185,\n \"acc_norm\": 0.8370842461661023,\n\
\ \"acc_norm_stderr\": 0.003685340687255413\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874141,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874141\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6225806451612903,\n \"acc_stderr\": 0.027575960723278233,\n \"\
acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.027575960723278233\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n \"\
acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7174311926605504,\n \"acc_stderr\": 0.01930424349770715,\n \"\
acc_norm\": 0.7174311926605504,\n \"acc_norm_stderr\": 0.01930424349770715\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395592,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419994,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419994\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978815,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978815\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7394636015325671,\n\
\ \"acc_stderr\": 0.01569600856380708,\n \"acc_norm\": 0.7394636015325671,\n\
\ \"acc_norm_stderr\": 0.01569600856380708\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.02572280220089581,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.02572280220089581\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.02760468902858199,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.02760468902858199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4061277705345502,\n\
\ \"acc_stderr\": 0.012543154588412932,\n \"acc_norm\": 0.4061277705345502,\n\
\ \"acc_norm_stderr\": 0.012543154588412932\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.576797385620915,\n \"acc_stderr\": 0.019987809769482064,\n \
\ \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.019987809769482064\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826368,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826368\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950833,\n \"mc2\": 0.47789419168905956,\n\
\ \"mc2_stderr\": 0.015029215162188553\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-3.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-14-31.712178.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-14-31.712178.parquet'
- config_name: results
data_files:
- split: 2023_10_08T18_14_31.712178
path:
- results_2023-10-08T18-14-31.712178.parquet
- split: latest
path:
- results_2023-10-08T18-14-31.712178.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-3.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-3.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-3.0](https://huggingface.co/jondurbin/airoboros-l2-13b-3.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0",
"harness_truthfulqa_mc_0",
split="train")
```
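As a quick follow-up sketch (assuming the `datasets` library, which the snippet above already uses), you can also target the "latest" split of a task configuration, or load the aggregated "results" configuration listed in the YAML header:
```python
from datasets import load_dataset

# Details for one task, always tracking the most recent evaluation run
# via the "latest" split.
details = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0",
    "harness_truthfulqa_mc_0",
    split="latest",
)

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0",
    "results",
    split="latest",
)
```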
## Latest results
These are the [latest results from run 2023-10-08T18:14:31.712178](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0/blob/main/results_2023-10-08T18-14-31.712178.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5502881382765671,
"acc_stderr": 0.034374799430363756,
"acc_norm": 0.5543095430488207,
"acc_norm_stderr": 0.03435303362752815,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950833,
"mc2": 0.47789419168905956,
"mc2_stderr": 0.015029215162188553
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578276
},
"harness|hellaswag|10": {
"acc": 0.632244572794264,
"acc_stderr": 0.004812088620277185,
"acc_norm": 0.8370842461661023,
"acc_norm_stderr": 0.003685340687255413
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874141,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874141
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278233,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7174311926605504,
"acc_stderr": 0.01930424349770715,
"acc_norm": 0.7174311926605504,
"acc_norm_stderr": 0.01930424349770715
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419994,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419994
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978815,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978815
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7394636015325671,
"acc_stderr": 0.01569600856380708,
"acc_norm": 0.7394636015325671,
"acc_norm_stderr": 0.01569600856380708
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.02572280220089581,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.02572280220089581
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.02760468902858199,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.02760468902858199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4061277705345502,
"acc_stderr": 0.012543154588412932,
"acc_norm": 0.4061277705345502,
"acc_norm_stderr": 0.012543154588412932
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.019987809769482064,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.019987809769482064
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950833,
"mc2": 0.47789419168905956,
"mc2_stderr": 0.015029215162188553
}
}
```
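To slice these numbers programmatically, here is a minimal sketch assuming the dict above has been saved locally as `results.json` (a hypothetical path):
```python
import json

# Local copy of the per-task metrics shown above.
with open("results.json") as f:
    metrics = json.load(f)

# Keep only the MMLU (hendrycksTest) tasks and list the five
# lowest-accuracy subjects.
mmlu = {
    task: vals["acc"]
    for task, vals in metrics.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1])[:5]:
    print(f"{task}: {acc:.3f}")
```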
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Antonio49/ULTIMAPRUEBA | 2023-10-08T18:21:50.000Z | [
"license:apache-2.0",
"region:us"
] | Antonio49 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
SumanthRH/rte_few_shot_arc | 2023-10-08T18:26:17.000Z | [
"region:us"
] | SumanthRH | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: problem
dtype: string
- name: solution
dtype: string
- name: response
dtype: string
- name: message
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 437310
num_examples: 200
download_size: 125236
dataset_size: 437310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rte_few_shot_arc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30 | 2023-10-08T18:26:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of JosephusCheung/Pwen-14B-Chat-20_30
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Pwen-14B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T18:25:24.586385](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30/blob/main/results_2023-10-08T18-25-24.586385.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5990888068369461,\n\
\ \"acc_stderr\": 0.03431005125414193,\n \"acc_norm\": 0.6027895245963486,\n\
\ \"acc_norm_stderr\": 0.03429409520845181,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4701781470729953,\n\
\ \"mc2_stderr\": 0.014777434418052576\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294321,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.611929894443338,\n\
\ \"acc_stderr\": 0.004863147544177516,\n \"acc_norm\": 0.7978490340569607,\n\
\ \"acc_norm_stderr\": 0.004007834585541846\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549652,\n\
\ \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549652\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099522,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099522\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411905,\n \"\
acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411905\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.039036986477484395,\n\
\ \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.039036986477484395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.040348466786033974,\n \"\
acc_norm\": 0.423841059602649,\n \"acc_norm_stderr\": 0.040348466786033974\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.01765871059444313,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.01765871059444313\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6176470588235294,\n \"acc_stderr\": 0.03410785338904719,\n \"\
acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03410785338904719\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.0403931497872456,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.0403931497872456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648372,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494583,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576277,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.016251139711570765,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.016251139711570765\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885996,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885996\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073056,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073056\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241748,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4701781470729953,\n\
\ \"mc2_stderr\": 0.014777434418052576\n }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-25-24.586385.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-25-24.586385.parquet'
- config_name: results
data_files:
- split: 2023_10_08T18_25_24.586385
path:
- results_2023-10-08T18-25-24.586385.parquet
- split: latest
path:
- results_2023-10-08T18-25-24.586385.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Pwen-14B-Chat-20_30
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Pwen-14B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30",
"harness_truthfulqa_mc_0",
split="train")
```
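If you only need the aggregated scores, the "results" configuration can be loaded the same way; per the configs listed above, the "latest" split always resolves to the most recent run (a minimal sketch):
```python
from datasets import load_dataset

# A minimal sketch: the "results" config aggregates all scores for a run,
# and the "latest" split always resolves to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30",
    "results",
    split="latest",
)
```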
## Latest results
These are the [latest results from run 2023-10-08T18:25:24.586385](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30/blob/main/results_2023-10-08T18-25-24.586385.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5990888068369461,
"acc_stderr": 0.03431005125414193,
"acc_norm": 0.6027895245963486,
"acc_norm_stderr": 0.03429409520845181,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4701781470729953,
"mc2_stderr": 0.014777434418052576
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294321,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.611929894443338,
"acc_stderr": 0.004863147544177516,
"acc_norm": 0.7978490340569607,
"acc_norm_stderr": 0.004007834585541846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549652,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549652
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099522,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099522
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411905,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411905
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.039036986477484395,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.039036986477484395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.040348466786033974,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.040348466786033974
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.01765871059444313,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.01765871059444313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03410785338904719,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03410785338904719
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.0403931497872456,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.0403931497872456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648372,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494583,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.026033890613576277,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.026033890613576277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570765,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885996,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885996
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073056,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073056
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.02971932942241748,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.02971932942241748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4701781470729953,
"mc2_stderr": 0.014777434418052576
}
}
```
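To drill into the per-sample details behind these aggregates, you can load a per-task configuration at this run's timestamped split (a minimal sketch; the config and split names come from the configs listed above):
```python
from datasets import load_dataset

# A sketch: per-task details for this specific run live under the
# timestamped split defined in the configs above.
details = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30",
    "harness_truthfulqa_mc_0",
    split="2023_10_08T18_25_24.586385",
)
df = details.to_pandas()  # inspect individual predictions as a DataFrame
```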
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Writer__palmyra-20b-chat | 2023-10-08T18:47:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Writer/palmyra-20b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Writer/palmyra-20b-chat](https://huggingface.co/Writer/palmyra-20b-chat) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__palmyra-20b-chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T18:46:04.606475](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-20b-chat/blob/main/results_2023-10-08T18-46-04.606475.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3560261410296551,\n\
\ \"acc_stderr\": 0.03459296547521772,\n \"acc_norm\": 0.3596410100634869,\n\
\ \"acc_norm_stderr\": 0.03458685051928871,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4316529913423131,\n\
\ \"mc2_stderr\": 0.015655577806555094\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.40102389078498296,\n \"acc_stderr\": 0.014322255790719867,\n\
\ \"acc_norm\": 0.4351535836177474,\n \"acc_norm_stderr\": 0.014487986197186047\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5491933877713603,\n\
\ \"acc_stderr\": 0.004965572246803863,\n \"acc_norm\": 0.7283409679346744,\n\
\ \"acc_norm_stderr\": 0.004439059440526252\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.038607315993160925,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.038607315993160925\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37735849056603776,\n \"acc_stderr\": 0.02983280811479601,\n\
\ \"acc_norm\": 0.37735849056603776,\n \"acc_norm_stderr\": 0.02983280811479601\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261114,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261114\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3903225806451613,\n\
\ \"acc_stderr\": 0.027751256636969576,\n \"acc_norm\": 0.3903225806451613,\n\
\ \"acc_norm_stderr\": 0.027751256636969576\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970187,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970187\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.398989898989899,\n \"acc_stderr\": 0.034889016168527305,\n \"\
acc_norm\": 0.398989898989899,\n \"acc_norm_stderr\": 0.034889016168527305\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.45077720207253885,\n \"acc_stderr\": 0.03590910952235524,\n\
\ \"acc_norm\": 0.45077720207253885,\n \"acc_norm_stderr\": 0.03590910952235524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3487179487179487,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.3487179487179487,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150006,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4055045871559633,\n \"acc_stderr\": 0.02105099799189684,\n \"\
acc_norm\": 0.4055045871559633,\n \"acc_norm_stderr\": 0.02105099799189684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456052,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4050632911392405,\n \"acc_stderr\": 0.031955147413706746,\n \
\ \"acc_norm\": 0.4050632911392405,\n \"acc_norm_stderr\": 0.031955147413706746\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.45454545454545453,\n \"acc_stderr\": 0.045454545454545456,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.045454545454545456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.03881891213334382,\n\
\ \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.03881891213334382\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258973,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5213675213675214,\n\
\ \"acc_stderr\": 0.03272616447634955,\n \"acc_norm\": 0.5213675213675214,\n\
\ \"acc_norm_stderr\": 0.03272616447634955\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40102171136653897,\n\
\ \"acc_stderr\": 0.017526133150124575,\n \"acc_norm\": 0.40102171136653897,\n\
\ \"acc_norm_stderr\": 0.017526133150124575\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.024476994076247323,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.024476994076247323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.01437816988409843,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.01437816988409843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n\
\ \"acc_stderr\": 0.02791705074848462,\n \"acc_norm\": 0.40836012861736337,\n\
\ \"acc_norm_stderr\": 0.02791705074848462\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3395061728395062,\n \"acc_stderr\": 0.026348564412011635,\n\
\ \"acc_norm\": 0.3395061728395062,\n \"acc_norm_stderr\": 0.026348564412011635\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30182529335071706,\n\
\ \"acc_stderr\": 0.01172435051810589,\n \"acc_norm\": 0.30182529335071706,\n\
\ \"acc_norm_stderr\": 0.01172435051810589\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.31699346405228757,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.31699346405228757,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\
\ \"acc_stderr\": 0.046313813194254635,\n \"acc_norm\": 0.37272727272727274,\n\
\ \"acc_norm_stderr\": 0.046313813194254635\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4204081632653061,\n \"acc_stderr\": 0.03160106993449603,\n\
\ \"acc_norm\": 0.4204081632653061,\n \"acc_norm_stderr\": 0.03160106993449603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.39303482587064675,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.39303482587064675,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.03733756969066164,\n\
\ \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.03733756969066164\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4316529913423131,\n\
\ \"mc2_stderr\": 0.015655577806555094\n }\n}\n```"
repo_url: https://huggingface.co/Writer/palmyra-20b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-46-04.606475.parquet'
- config_name: results
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- results_2023-10-08T18-46-04.606475.parquet
- split: latest
path:
- results_2023-10-08T18-46-04.606475.parquet
---
# Dataset Card for Evaluation run of Writer/palmyra-20b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/palmyra-20b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/palmyra-20b-chat](https://huggingface.co/Writer/palmyra-20b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__palmyra-20b-chat",
"harness_truthfulqa_mc_0",
split="train")
```
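Similarly, the aggregated "results" configuration described above can be loaded through the same API; a minimal sketch, assuming the `datasets` library is installed and relying on the "latest" split alias:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics, and the
# "latest" split always aliases the most recent timestamped run.
results = load_dataset("open-llm-leaderboard/details_Writer__palmyra-20b-chat",
                       "results",
                       split="latest")
```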
## Latest results
These are the [latest results from run 2023-10-08T18:46:04.606475](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-20b-chat/blob/main/results_2023-10-08T18-46-04.606475.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3560261410296551,
"acc_stderr": 0.03459296547521772,
"acc_norm": 0.3596410100634869,
"acc_norm_stderr": 0.03458685051928871,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4316529913423131,
"mc2_stderr": 0.015655577806555094
},
"harness|arc:challenge|25": {
"acc": 0.40102389078498296,
"acc_stderr": 0.014322255790719867,
"acc_norm": 0.4351535836177474,
"acc_norm_stderr": 0.014487986197186047
},
"harness|hellaswag|10": {
"acc": 0.5491933877713603,
"acc_stderr": 0.004965572246803863,
"acc_norm": 0.7283409679346744,
"acc_norm_stderr": 0.004439059440526252
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.038607315993160925,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.038607315993160925
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37735849056603776,
"acc_stderr": 0.02983280811479601,
"acc_norm": 0.37735849056603776,
"acc_norm_stderr": 0.02983280811479601
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261114,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261114
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3903225806451613,
"acc_stderr": 0.027751256636969576,
"acc_norm": 0.3903225806451613,
"acc_norm_stderr": 0.027751256636969576
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970187,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970187
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.398989898989899,
"acc_stderr": 0.034889016168527305,
"acc_norm": 0.398989898989899,
"acc_norm_stderr": 0.034889016168527305
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45077720207253885,
"acc_stderr": 0.03590910952235524,
"acc_norm": 0.45077720207253885,
"acc_norm_stderr": 0.03590910952235524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3487179487179487,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.3487179487179487,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150006,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4055045871559633,
"acc_stderr": 0.02105099799189684,
"acc_norm": 0.4055045871559633,
"acc_norm_stderr": 0.02105099799189684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456052,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4050632911392405,
"acc_stderr": 0.031955147413706746,
"acc_norm": 0.4050632911392405,
"acc_norm_stderr": 0.031955147413706746
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.045454545454545456,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.045454545454545456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.03881891213334382,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.03881891213334382
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258973,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5213675213675214,
"acc_stderr": 0.03272616447634955,
"acc_norm": 0.5213675213675214,
"acc_norm_stderr": 0.03272616447634955
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40102171136653897,
"acc_stderr": 0.017526133150124575,
"acc_norm": 0.40102171136653897,
"acc_norm_stderr": 0.017526133150124575
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.024476994076247323,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.024476994076247323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.01437816988409843,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.01437816988409843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.02791705074848462,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.02791705074848462
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3395061728395062,
"acc_stderr": 0.026348564412011635,
"acc_norm": 0.3395061728395062,
"acc_norm_stderr": 0.026348564412011635
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140242,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30182529335071706,
"acc_stderr": 0.01172435051810589,
"acc_norm": 0.30182529335071706,
"acc_norm_stderr": 0.01172435051810589
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.31699346405228757,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.31699346405228757,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.046313813194254635,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.046313813194254635
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4204081632653061,
"acc_stderr": 0.03160106993449603,
"acc_norm": 0.4204081632653061,
"acc_norm_stderr": 0.03160106993449603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.39303482587064675,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.39303482587064675,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4316529913423131,
"mc2_stderr": 0.015655577806555094
}
}
```
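To inspect a payload like this one programmatically, the per-task metrics can be ranked straight from the parsed JSON; a minimal sketch, assuming the file has been saved locally as `results.json` (a hypothetical path):
```python
import json

# Parse a results file like the one above (hypothetical local path).
with open("results.json") as f:
    results = json.load(f)

# Rank per-task entries by acc_norm, skipping the "all" aggregate and
# entries without that metric (e.g. truthfulqa:mc only reports mc1/mc2).
tasks = {name: m for name, m in results.items()
         if name != "all" and "acc_norm" in m}
for name, m in sorted(tasks.items(), key=lambda kv: kv[1]["acc_norm"],
                      reverse=True)[:5]:
    print(f"{name}: acc_norm={m['acc_norm']:.3f}")
```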
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
snirjhar-colab/sanzi | 2023-10-08T19:11:04.000Z | [
"region:us"
] | snirjhar-colab | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_harborwater__wizard-orca-3b | 2023-10-08T19:22:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of harborwater/wizard-orca-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [harborwater/wizard-orca-3b](https://huggingface.co/harborwater/wizard-orca-3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__wizard-orca-3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T19:21:18.723038](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__wizard-orca-3b/blob/main/results_2023-10-08T19-21-18.723038.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2523974583502166,\n\
\ \"acc_stderr\": 0.03129775621975096,\n \"acc_norm\": 0.25584668997848853,\n\
\ \"acc_norm_stderr\": 0.031292926782164415,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015027,\n \"mc2\": 0.40035714074695034,\n\
\ \"mc2_stderr\": 0.014298393496028232\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3848122866894198,\n \"acc_stderr\": 0.014218371065251102,\n\
\ \"acc_norm\": 0.41723549488054607,\n \"acc_norm_stderr\": 0.01440982551840308\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5467038438558056,\n\
\ \"acc_stderr\": 0.00496796581019999,\n \"acc_norm\": 0.7177853017327226,\n\
\ \"acc_norm_stderr\": 0.004491574539441884\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21693121693121692,\n \"acc_stderr\": 0.02122708244944508,\n \"\
acc_norm\": 0.21693121693121692,\n \"acc_norm_stderr\": 0.02122708244944508\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848876,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848876\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1870967741935484,\n\
\ \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.1870967741935484,\n\
\ \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n\
\ \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.02655220782821529,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.02655220782821529\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n\
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804726,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804726\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.01771260052872273,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.01771260052872273\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491656,\n \"\
acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.19117647058823528,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794089,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794089\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.13592233009708737,\n \"acc_stderr\": 0.03393295729761013,\n\
\ \"acc_norm\": 0.13592233009708737,\n \"acc_norm_stderr\": 0.03393295729761013\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n\
\ \"acc_stderr\": 0.030236389942173116,\n \"acc_norm\": 0.3076923076923077,\n\
\ \"acc_norm_stderr\": 0.030236389942173116\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.01532988894089987,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.01532988894089987\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103982,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103982\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.01089612365267665,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.01089612365267665\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322263,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.03610805018031023,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.03610805018031023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015027,\n \"mc2\": 0.40035714074695034,\n\
\ \"mc2_stderr\": 0.014298393496028232\n }\n}\n```"
repo_url: https://huggingface.co/harborwater/wizard-orca-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-21-18.723038.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-21-18.723038.parquet'
- config_name: results
data_files:
- split: 2023_10_08T19_21_18.723038
path:
- results_2023-10-08T19-21-18.723038.parquet
- split: latest
path:
- results_2023-10-08T19-21-18.723038.parquet
---
# Dataset Card for Evaluation run of harborwater/wizard-orca-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/harborwater/wizard-orca-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [harborwater/wizard-orca-3b](https://huggingface.co/harborwater/wizard-orca-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harborwater__wizard-orca-3b",
"harness_truthfulqa_mc_0",
split="train")
```
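The configurations and splits listed in the YAML header above can also be addressed directly. As a minimal sketch based on those names (the timestamped split below is this run's timestamp; adjust it for other runs):
```python
from datasets import load_dataset

# Aggregated metrics for the run: the "results" configuration, "latest" split
results = load_dataset("open-llm-leaderboard/details_harborwater__wizard-orca-3b",
                       "results",
                       split="latest")

# The same per-task details, addressed by the timestamped split instead of "train"
details = load_dataset("open-llm-leaderboard/details_harborwater__wizard-orca-3b",
                       "harness_truthfulqa_mc_0",
                       split="2023_10_08T19_21_18.723038")
```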
## Latest results
These are the [latest results from run 2023-10-08T19:21:18.723038](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__wizard-orca-3b/blob/main/results_2023-10-08T19-21-18.723038.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2523974583502166,
"acc_stderr": 0.03129775621975096,
"acc_norm": 0.25584668997848853,
"acc_norm_stderr": 0.031292926782164415,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015027,
"mc2": 0.40035714074695034,
"mc2_stderr": 0.014298393496028232
},
"harness|arc:challenge|25": {
"acc": 0.3848122866894198,
"acc_stderr": 0.014218371065251102,
"acc_norm": 0.41723549488054607,
"acc_norm_stderr": 0.01440982551840308
},
"harness|hellaswag|10": {
"acc": 0.5467038438558056,
"acc_stderr": 0.00496796581019999,
"acc_norm": 0.7177853017327226,
"acc_norm_stderr": 0.004491574539441884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21693121693121692,
"acc_stderr": 0.02122708244944508,
"acc_norm": 0.21693121693121692,
"acc_norm_stderr": 0.02122708244944508
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848876,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848876
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.02655220782821529,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.02655220782821529
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804726,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804726
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.01771260052872273,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.01771260052872273
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491656,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.13592233009708737,
"acc_stderr": 0.03393295729761013,
"acc_norm": 0.13592233009708737,
"acc_norm_stderr": 0.03393295729761013
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173116,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173116
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.01532988894089987,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.01532988894089987
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103982,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103982
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.01089612365267665,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.01089612365267665
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322263,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031023,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015027,
"mc2": 0.40035714074695034,
"mc2_stderr": 0.014298393496028232
}
}
```
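The snapshot above mirrors the raw results file linked in the preceding paragraph. As a minimal sketch (assuming only the standard `huggingface_hub` download helper), the same numbers can be read programmatically:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run from the dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_harborwater__wizard-orca-3b",
    filename="results_2023-10-08T19-21-18.723038.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # 0.2523974583502166 in the snapshot above
```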
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
toninhodjj/morgan | 2023-10-08T19:27:02.000Z | [
"region:us"
] | toninhodjj | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b | 2023-10-08T19:37:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/duplicitous-slurpbeast-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/duplicitous-slurpbeast-13b](https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T19:35:50.428127](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b/blob/main/results_2023-10-08T19-35-50.428127.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5768492492689764,\n\
\ \"acc_stderr\": 0.03432049836847268,\n \"acc_norm\": 0.580548316645243,\n\
\ \"acc_norm_stderr\": 0.03429892446745067,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5232823914450195,\n\
\ \"mc2_stderr\": 0.015871876766326714\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.014346869060229321,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.01417591549000032\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6473809998008365,\n\
\ \"acc_stderr\": 0.004768088918512182,\n \"acc_norm\": 0.8391754630551683,\n\
\ \"acc_norm_stderr\": 0.0036661823284423432\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332786,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332786\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300993,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300993\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397436,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633142,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633142\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709581,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121615,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.01671246744170252,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.01671246744170252\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n\
\ \"acc_stderr\": 0.012707390438502348,\n \"acc_norm\": 0.45045632333767927,\n\
\ \"acc_norm_stderr\": 0.012707390438502348\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5898692810457516,\n \"acc_stderr\": 0.019898412717635896,\n \
\ \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.019898412717635896\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5232823914450195,\n\
\ \"mc2_stderr\": 0.015871876766326714\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-35-50.428127.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-35-50.428127.parquet'
- config_name: results
data_files:
- split: 2023_10_08T19_35_50.428127
path:
- results_2023-10-08T19-35-50.428127.parquet
- split: latest
path:
- results_2023-10-08T19-35-50.428127.parquet
---
# Dataset Card for Evaluation run of chargoddard/duplicitous-slurpbeast-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/duplicitous-slurpbeast-13b](https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b",
"harness_truthfulqa_mc_0",
split="train")
```
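Each per-task configuration also exposes a "latest" split pointing at the most recent run (see the `configs` section in this card's metadata). As a minimal sketch, assuming the config names listed there, you could load a single task's details like this:
```python
from datasets import load_dataset

# Load the detailed predictions for one task; the "latest" split always
# points at the most recent evaluation run for that configuration.
details = load_dataset(
    "open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```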
## Latest results
These are the [latest results from run 2023-10-08T19:35:50.428127](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b/blob/main/results_2023-10-08T19-35-50.428127.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5768492492689764,
"acc_stderr": 0.03432049836847268,
"acc_norm": 0.580548316645243,
"acc_norm_stderr": 0.03429892446745067,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.5232823914450195,
"mc2_stderr": 0.015871876766326714
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.014346869060229321,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.01417591549000032
},
"harness|hellaswag|10": {
"acc": 0.6473809998008365,
"acc_stderr": 0.004768088918512182,
"acc_norm": 0.8391754630551683,
"acc_norm_stderr": 0.0036661823284423432
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504513,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504513
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332786,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300993,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300993
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397436,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633142,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633142
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709581,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.025522474632121615,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.025522474632121615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.01671246744170252,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.01671246744170252
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829028,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829028
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502348,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502348
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.019898412717635896,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.019898412717635896
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.5232823914450195,
"mc2_stderr": 0.015871876766326714
}
}
```
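If you want these aggregated numbers programmatically rather than copied from the JSON above, a minimal sketch is to load the "results" configuration defined in this card's metadata; the exact column layout of the results parquet isn't documented here, so inspect it before relying on specific fields:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics shown above;
# its "latest" split points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema first
print(results[0])            # first aggregated row (layout is an assumption)
```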
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b | 2023-10-08T19:37:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/duplicitous-mammal-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/duplicitous-mammal-13b](https://huggingface.co/chargoddard/duplicitous-mammal-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T19:36:16.264447](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b/blob/main/results_2023-10-08T19-36-16.264447.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5764888573860016,\n\
\ \"acc_stderr\": 0.03428954344540111,\n \"acc_norm\": 0.5801512908650484,\n\
\ \"acc_norm_stderr\": 0.03426843961028206,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5227009331210287,\n\
\ \"mc2_stderr\": 0.015839308789732245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009123,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672877\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6456881099382593,\n\
\ \"acc_stderr\": 0.004773267510112741,\n \"acc_norm\": 0.8378809002190799,\n\
\ \"acc_norm_stderr\": 0.00367806799442448\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.0433643270799318,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.0433643270799318\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920938,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.027218889773308767,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.027218889773308767\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764205,\n\
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.763302752293578,\n \"acc_stderr\": 0.018224078117299092,\n \"\
acc_norm\": 0.763302752293578,\n \"acc_norm_stderr\": 0.018224078117299092\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069432,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069432\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.01480538447837115,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.01480538447837115\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48156424581005586,\n\
\ \"acc_stderr\": 0.016711130497782813,\n \"acc_norm\": 0.48156424581005586,\n\
\ \"acc_norm_stderr\": 0.016711130497782813\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363933,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363933\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399684,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399684\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.01988622103750187,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.01988622103750187\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208955,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208955\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5227009331210287,\n\
\ \"mc2_stderr\": 0.015839308789732245\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/duplicitous-mammal-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-36-16.264447.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-36-16.264447.parquet'
- config_name: results
data_files:
- split: 2023_10_08T19_36_16.264447
path:
- results_2023-10-08T19-36-16.264447.parquet
- split: latest
path:
- results_2023-10-08T19-36-16.264447.parquet
---
# Dataset Card for Evaluation run of chargoddard/duplicitous-mammal-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/duplicitous-mammal-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/duplicitous-mammal-13b](https://huggingface.co/chargoddard/duplicitous-mammal-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b",
"harness_truthfulqa_mc_0",
split="train")
```
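The same pattern applies to the aggregated metrics: the configs above declare a "results" configuration, and every configuration carries a "latest" split alias for the most recent timestamped run. A minimal sketch, assuming the same `datasets` API as in the example above:
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run; "latest" is an alias for the
# newest timestamped split (here 2023_10_08T19_36_16.264447).
results = load_dataset("open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b",
                       "results",
                       split="latest")
```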
## Latest results
These are the [latest results from run 2023-10-08T19:36:16.264447](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b/blob/main/results_2023-10-08T19-36-16.264447.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5764888573860016,
"acc_stderr": 0.03428954344540111,
"acc_norm": 0.5801512908650484,
"acc_norm_stderr": 0.03426843961028206,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5227009331210287,
"mc2_stderr": 0.015839308789732245
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009123,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672877
},
"harness|hellaswag|10": {
"acc": 0.6456881099382593,
"acc_stderr": 0.004773267510112741,
"acc_norm": 0.8378809002190799,
"acc_norm_stderr": 0.00367806799442448
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.0433643270799318,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.0433643270799318
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920938,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308767,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308767
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.025189149894764205,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.025189149894764205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.018224078117299092,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.018224078117299092
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069432,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.01480538447837115,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.01480538447837115
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531018,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531018
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48156424581005586,
"acc_stderr": 0.016711130497782813,
"acc_norm": 0.48156424581005586,
"acc_norm_stderr": 0.016711130497782813
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363933,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363933
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399684,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.01988622103750187,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.01988622103750187
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208955,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208955
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5227009331210287,
"mc2_stderr": 0.015839308789732245
}
}
```
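Since the per-task entries above all share the same shape, post-processing them is straightforward. A small sketch, assuming a local copy of the linked results file (the top-level layout of the downloaded JSON may differ slightly from the excerpt above, so adjust the lookup if needed):
```python
import json

# Hypothetical local copy of the results file linked in "Latest results".
with open("results_2023-10-08T19-36-16.264447.json") as f:
    results = json.load(f)

# Unweighted mean of acc_norm over the hendrycksTest (MMLU) subtasks,
# e.g. to sanity-check the aggregated "all" figures at the top.
mmlu = [m["acc_norm"] for t, m in results.items()
        if t.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```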
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
trunghlt/stem-wiki-articles-based-on-cohere-embedding | 2023-10-08T19:40:55.000Z | [
"license:cc-by-4.0",
"region:us"
] | trunghlt | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2 | 2023-10-08T19:45:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of caisarl76/mistral-guanaco1k-ep2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [caisarl76/mistral-guanaco1k-ep2](https://huggingface.co/caisarl76/mistral-guanaco1k-ep2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T19:43:46.755661](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2/blob/main/results_2023-10-08T19-43-46.755661.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6143042166493967,\n\
\ \"acc_stderr\": 0.03338971893001228,\n \"acc_norm\": 0.618336439639903,\n\
\ \"acc_norm_stderr\": 0.03336872879395508,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.543989419660622,\n\
\ \"mc2_stderr\": 0.014697499688148498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256515,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946705\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6272654849631547,\n\
\ \"acc_stderr\": 0.004825441080261185,\n \"acc_norm\": 0.8276239792869946,\n\
\ \"acc_norm_stderr\": 0.0037693500791958923\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915331,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915331\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630783,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615771,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803824,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803824\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.01477676506643889,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.01477676506643889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.012638223880313168,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.012638223880313168\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.01948802574552967,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.01948802574552967\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712846,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.543989419660622,\n\
\ \"mc2_stderr\": 0.014697499688148498\n }\n}\n```"
repo_url: https://huggingface.co/caisarl76/mistral-guanaco1k-ep2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-43-46.755661.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-43-46.755661.parquet'
- config_name: results
data_files:
- split: 2023_10_08T19_43_46.755661
path:
- results_2023-10-08T19-43-46.755661.parquet
- split: latest
path:
- results_2023-10-08T19-43-46.755661.parquet
---
# Dataset Card for Evaluation run of caisarl76/mistral-guanaco1k-ep2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/caisarl76/mistral-guanaco1k-ep2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [caisarl76/mistral-guanaco1k-ep2](https://huggingface.co/caisarl76/mistral-guanaco1k-ep2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2",
"harness_truthfulqa_mc_0",
split="train")
```
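Each per-task configuration can be loaded the same way. As a minimal sketch (using one of the config names from the `configs` list above), you can pull the most recent evaluation of a single MMLU subtask via the "latest" split:
```python
from datasets import load_dataset

# Load the most recent per-sample details for one MMLU subtask.
# The config name comes from the `configs` section of this card;
# per that section, "latest" points at the newest timestamped run.
details = load_dataset(
    "open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```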
## Latest results
These are the [latest results from run 2023-10-08T19:43:46.755661](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2/blob/main/results_2023-10-08T19-43-46.755661.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6143042166493967,
"acc_stderr": 0.03338971893001228,
"acc_norm": 0.618336439639903,
"acc_norm_stderr": 0.03336872879395508,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.543989419660622,
"mc2_stderr": 0.014697499688148498
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256515,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946705
},
"harness|hellaswag|10": {
"acc": 0.6272654849631547,
"acc_stderr": 0.004825441080261185,
"acc_norm": 0.8276239792869946,
"acc_norm_stderr": 0.0037693500791958923
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915331,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915331
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630783,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615771,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803824,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803824
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.01477676506643889,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.01477676506643889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313168,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313168
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.01948802574552967,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.01948802574552967
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712846,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.543989419660622,
"mc2_stderr": 0.014697499688148498
}
}
```
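As a quick sanity check on these numbers, here is a minimal sketch (an illustration, not part of the original card; it assumes the downloaded JSON mirrors the dictionary printed above and that `huggingface_hub` is installed) that recomputes the unweighted MMLU average from the per-subtask entries:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated-results file for this run (filename taken from
# the link above; assumes the file has the same layout as the dict shown).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2",
    filename="results_2023-10-08T19-43-46.755661.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Keep only the 57 MMLU ("hendrycksTest") entries and average their acc.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
avg_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU 5-shot average acc over {len(mmlu)} subtasks: {avg_acc:.4f}")
```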
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Ayansk11/test | 2023-10-08T19:50:56.000Z | [
"region:us"
] | Ayansk11 | null | null | null | 0 | 0 | Entry not found |
OwenHook/MicroCam | 2023-10-08T19:46:01.000Z | [
"license:apache-2.0",
"region:us"
] | OwenHook | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Ayansk11/llama2_merged_file11 | 2023-10-08T19:52:41.000Z | [
"region:us"
] | Ayansk11 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo | 2023-10-08T19:53:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beaugogh/Llama2-7b-openorca-mc-v2-dpo](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T19:52:28.810718](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo/blob/main/results_2023-10-08T19-52-28.810718.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4750712786013465,\n\
\ \"acc_stderr\": 0.03526846447027585,\n \"acc_norm\": 0.4790877754279391,\n\
\ \"acc_norm_stderr\": 0.035251105875716136,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.531348782604687,\n\
\ \"mc2_stderr\": 0.015530602873554405\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.01461130570505699,\n\
\ \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633829\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6238797052380004,\n\
\ \"acc_stderr\": 0.00483420796406132,\n \"acc_norm\": 0.8147779326827326,\n\
\ \"acc_norm_stderr\": 0.0038768367094611364\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5129032258064516,\n\
\ \"acc_stderr\": 0.02843453315268186,\n \"acc_norm\": 0.5129032258064516,\n\
\ \"acc_norm_stderr\": 0.02843453315268186\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588718,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588718\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.0355580405176393,\n \"acc_norm\"\
: 0.5303030303030303,\n \"acc_norm_stderr\": 0.0355580405176393\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"\
acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.02969633871342288,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.02969633871342288\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n\
\ \"acc_stderr\": 0.016808322261740456,\n \"acc_norm\": 0.6704980842911877,\n\
\ \"acc_norm_stderr\": 0.016808322261740456\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.026918645383239015,\n\
\ \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.026918645383239015\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45751633986928103,\n \"acc_stderr\": 0.028526383452142638,\n\
\ \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.028526383452142638\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n\
\ \"acc_stderr\": 0.02825666072336018,\n \"acc_norm\": 0.5498392282958199,\n\
\ \"acc_norm_stderr\": 0.02825666072336018\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5246913580246914,\n \"acc_stderr\": 0.027786800931427443,\n\
\ \"acc_norm\": 0.5246913580246914,\n \"acc_norm_stderr\": 0.027786800931427443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596154,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n\
\ \"acc_stderr\": 0.012337391684530312,\n \"acc_norm\": 0.3709256844850065,\n\
\ \"acc_norm_stderr\": 0.012337391684530312\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485697,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485697\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47058823529411764,\n \"acc_stderr\": 0.020192808271433795,\n \
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.020192808271433795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.531348782604687,\n\
\ \"mc2_stderr\": 0.015530602873554405\n }\n}\n```"
repo_url: https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-52-28.810718.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T19-52-28.810718.parquet'
- config_name: results
data_files:
- split: 2023_10_08T19_52_28.810718
path:
- results_2023-10-08T19-52-28.810718.parquet
- split: latest
path:
- results_2023-10-08T19-52-28.810718.parquet
---
# Dataset Card for Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2-dpo
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-openorca-mc-v2-dpo](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo",
"harness_truthfulqa_mc_0",
	split="latest")
```
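Similarly, a minimal sketch for loading the aggregated metrics (using the "results" configuration and the "latest" split declared in the configs above) could be:
```python
from datasets import load_dataset

# The "latest" split of the "results" configuration points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo",
	"results",
	split="latest")
```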
## Latest results
These are the [latest results from run 2023-10-08T19:52:28.810718](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo/blob/main/results_2023-10-08T19-52-28.810718.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4750712786013465,
"acc_stderr": 0.03526846447027585,
"acc_norm": 0.4790877754279391,
"acc_norm_stderr": 0.035251105875716136,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.531348782604687,
"mc2_stderr": 0.015530602873554405
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.01461130570505699,
"acc_norm": 0.5477815699658704,
"acc_norm_stderr": 0.014544519880633829
},
"harness|hellaswag|10": {
"acc": 0.6238797052380004,
"acc_stderr": 0.00483420796406132,
"acc_norm": 0.8147779326827326,
"acc_norm_stderr": 0.0038768367094611364
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425072,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5129032258064516,
"acc_stderr": 0.02843453315268186,
"acc_norm": 0.5129032258064516,
"acc_norm_stderr": 0.02843453315268186
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588718,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588718
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.0355580405176393,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.0355580405176393
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488418,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488418
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.016808322261740456,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.016808322261740456
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.026918645383239015,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.026918645383239015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45751633986928103,
"acc_stderr": 0.028526383452142638,
"acc_norm": 0.45751633986928103,
"acc_norm_stderr": 0.028526383452142638
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.02825666072336018,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.02825666072336018
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5246913580246914,
"acc_stderr": 0.027786800931427443,
"acc_norm": 0.5246913580246914,
"acc_norm_stderr": 0.027786800931427443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596154,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.012337391684530312,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.012337391684530312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485697,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485697
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.020192808271433795,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.020192808271433795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.531348782604687,
"mc2_stderr": 0.015530602873554405
}
}
```
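To post-process these numbers, for example to average accuracy over the MMLU ("hendrycksTest") tasks, a minimal sketch follows; it assumes the dictionary above has been saved locally as `results.json` (a hypothetical filename used only for illustration):
```python
import json

# Assumption: the results dictionary printed above was saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

# Average "acc" over every MMLU ("hendrycksTest") task entry.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"mean MMLU acc over {len(mmlu_accs)} tasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```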
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DaviGamer/KennyMcCormick | 2023-10-08T20:39:47.000Z | [
"license:openrail",
"region:us"
] | DaviGamer | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf | 2023-10-08T20:05:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ibranze/araproje-llama2-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ibranze/araproje-llama2-7b-hf](https://huggingface.co/ibranze/araproje-llama2-7b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T20:04:34.106747](https://huggingface.co/datasets/open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf/blob/main/results_2023-10-08T20-04-34.106747.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47043597201725107,\n\
\ \"acc_stderr\": 0.03529263908245757,\n \"acc_norm\": 0.47444479585837357,\n\
\ \"acc_norm_stderr\": 0.03527837427331349,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5884285998805019,\n\
\ \"acc_stderr\": 0.0049111251010646425,\n \"acc_norm\": 0.785700059749054,\n\
\ \"acc_norm_stderr\": 0.004094971980892084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729555,\n \"\
acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729555\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015476,\n \"\
acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015476\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n\
\ \"acc_stderr\": 0.01226793547751903,\n \"acc_norm\": 0.36114732724902215,\n\
\ \"acc_norm_stderr\": 0.01226793547751903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n }\n}\n```"
repo_url: https://huggingface.co/ibranze/araproje-llama2-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-04-34.106747.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-04-34.106747.parquet'
- config_name: results
data_files:
- split: 2023_10_08T20_04_34.106747
path:
- results_2023-10-08T20-04-34.106747.parquet
- split: latest
path:
- results_2023-10-08T20-04-34.106747.parquet
---
# Dataset Card for Evaluation run of ibranze/araproje-llama2-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ibranze/araproje-llama2-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ibranze/araproje-llama2-7b-hf](https://huggingface.co/ibranze/araproje-llama2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
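The same pattern applies to the "results" configuration described above; as a minimal sketch (assuming the "latest" split layout listed in this card's configs), you can load the aggregated metrics of the most recent run directly:
```python
from datasets import load_dataset

# Minimal sketch: the "results" configuration stores the aggregated
# metrics of each run; the "latest" split points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf",
    "results",
    split="latest",
)
```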
## Latest results
These are the [latest results from run 2023-10-08T20:04:34.106747](https://huggingface.co/datasets/open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf/blob/main/results_2023-10-08T20-04-34.106747.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47043597201725107,
"acc_stderr": 0.03529263908245757,
"acc_norm": 0.47444479585837357,
"acc_norm_stderr": 0.03527837427331349,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5884285998805019,
"acc_stderr": 0.0049111251010646425,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.004094971980892084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6311926605504588,
"acc_stderr": 0.020686227560729555,
"acc_norm": 0.6311926605504588,
"acc_norm_stderr": 0.020686227560729555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015476,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015476
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.01226793547751903,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.01226793547751903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
}
}
```
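If you prefer to work with the raw results file directly, here is a short hedged sketch (assumption: the downloaded JSON's top level matches the dictionary printed above, with one entry per task plus an "all" aggregate) that macro-averages the hendrycksTest subtask scores:
```python
import json

# Hedged sketch (assumption: the file's top level matches the dictionary
# printed above, keyed by task name).
with open("results_2023-10-08T20-04-34.106747.json") as f:
    results = json.load(f)

# Macro-average acc_norm over the hendrycksTest (MMLU) subtasks.
scores = [v["acc_norm"] for k, v in results.items()
          if k.startswith("harness|hendrycksTest-")]
print(f"hendrycksTest acc_norm over {len(scores)} subtasks: "
      f"{sum(scores) / len(scores):.4f}")
```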
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b | 2023-10-08T20:24:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PygmalionAI/pygmalion-2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PygmalionAI/pygmalion-2-7b](https://huggingface.co/PygmalionAI/pygmalion-2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T20:22:41.887829](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b/blob/main/results_2023-10-08T20-22-41.887829.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49287508581343986,\n\
\ \"acc_stderr\": 0.03498236330341224,\n \"acc_norm\": 0.4968617374915938,\n\
\ \"acc_norm_stderr\": 0.03496801839069801,\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950235,\n \"mc2\": 0.4378491944225627,\n\
\ \"mc2_stderr\": 0.014339426796993134\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.01461119932984378,\n\
\ \"acc_norm\": 0.5401023890784983,\n \"acc_norm_stderr\": 0.01456431885692485\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5846444931288588,\n\
\ \"acc_stderr\": 0.004917761181740162,\n \"acc_norm\": 0.7823142800238996,\n\
\ \"acc_norm_stderr\": 0.004118291804519085\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\":\
\ 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.13725490196078433,\n \"acc_stderr\": 0.03424084669891521,\n\
\ \"acc_norm\": 0.13725490196078433,\n \"acc_norm_stderr\": 0.03424084669891521\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112143,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5419354838709678,\n \"acc_stderr\": 0.028343787250540618,\n \"\
acc_norm\": 0.5419354838709678,\n \"acc_norm_stderr\": 0.028343787250540618\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280459,\n \"\
acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280459\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.03481285338232963,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03481285338232963\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126177,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126177\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6587155963302752,\n\
\ \"acc_stderr\": 0.020328612816592446,\n \"acc_norm\": 0.6587155963302752,\n\
\ \"acc_norm_stderr\": 0.020328612816592446\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n\
\ \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.0332057461294543,\n \"acc_norm\"\
: 0.6617647058823529,\n \"acc_norm_stderr\": 0.0332057461294543\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \"\
acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138936,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138936\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.02920254015343117,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.02920254015343117\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6832694763729247,\n\
\ \"acc_stderr\": 0.016635566427712568,\n \"acc_norm\": 0.6832694763729247,\n\
\ \"acc_norm_stderr\": 0.016635566427712568\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n\
\ \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369916,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369916\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.028491993586171566,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.028491993586171566\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.02760468902858199,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.02760468902858199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.02780165621232366,\n\
\ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.02780165621232366\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3670143415906128,\n\
\ \"acc_stderr\": 0.012310264244842132,\n \"acc_norm\": 0.3670143415906128,\n\
\ \"acc_norm_stderr\": 0.012310264244842132\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46405228758169936,\n \"acc_stderr\": 0.020175488765484043,\n \
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.020175488765484043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.03155782816556165,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.03155782816556165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950235,\n \"mc2\": 0.4378491944225627,\n\
\ \"mc2_stderr\": 0.014339426796993134\n }\n}\n```"
repo_url: https://huggingface.co/PygmalionAI/pygmalion-2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-22-41.887829.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-22-41.887829.parquet'
- config_name: results
data_files:
- split: 2023_10_08T20_22_41.887829
path:
- results_2023-10-08T20-22-41.887829.parquet
- split: latest
path:
- results_2023-10-08T20-22-41.887829.parquet
---
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PygmalionAI/pygmalion-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-2-7b](https://huggingface.co/PygmalionAI/pygmalion-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
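You can also pin the freshest numbers for a single task through its `latest` split; a minimal sketch using one of the config names declared in the YAML above:
```python
from datasets import load_dataset

# "latest" is an alias for the newest timestamped split of each config.
data = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```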
## Latest results
These are the [latest results from run 2023-10-08T20:22:41.887829](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b/blob/main/results_2023-10-08T20-22-41.887829.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49287508581343986,
"acc_stderr": 0.03498236330341224,
"acc_norm": 0.4968617374915938,
"acc_norm_stderr": 0.03496801839069801,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950235,
"mc2": 0.4378491944225627,
"mc2_stderr": 0.014339426796993134
},
"harness|arc:challenge|25": {
"acc": 0.5025597269624573,
"acc_stderr": 0.01461119932984378,
"acc_norm": 0.5401023890784983,
"acc_norm_stderr": 0.01456431885692485
},
"harness|hellaswag|10": {
"acc": 0.5846444931288588,
"acc_stderr": 0.004917761181740162,
"acc_norm": 0.7823142800238996,
"acc_norm_stderr": 0.004118291804519085
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.13725490196078433,
"acc_stderr": 0.03424084669891521,
"acc_norm": 0.13725490196078433,
"acc_norm_stderr": 0.03424084669891521
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112143,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5419354838709678,
"acc_stderr": 0.028343787250540618,
"acc_norm": 0.5419354838709678,
"acc_norm_stderr": 0.028343787250540618
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280459,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280459
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03481285338232963,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03481285338232963
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126177,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126177
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6587155963302752,
"acc_stderr": 0.020328612816592446,
"acc_norm": 0.6587155963302752,
"acc_norm_stderr": 0.020328612816592446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.0332057461294543,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.0332057461294543
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955924,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955924
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138936,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138936
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.02920254015343117,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.02920254015343117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6832694763729247,
"acc_stderr": 0.016635566427712568,
"acc_norm": 0.6832694763729247,
"acc_norm_stderr": 0.016635566427712568
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369916,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369916
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.028491993586171566,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.028491993586171566
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.02760468902858199,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.02760468902858199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.02780165621232366,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.02780165621232366
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3670143415906128,
"acc_stderr": 0.012310264244842132,
"acc_norm": 0.3670143415906128,
"acc_norm_stderr": 0.012310264244842132
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.020175488765484043,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.020175488765484043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.03155782816556165,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.03155782816556165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950235,
"mc2": 0.4378491944225627,
"mc2_stderr": 0.014339426796993134
}
}
```
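The aggregated numbers above can also be pulled programmatically from the `results` configuration defined in this card; a minimal sketch (the exact row layout of the parquet file is an assumption):
```python
from datasets import load_dataset

# The "results" config points at the results_*.parquet file for the run.
results = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics for the run
```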
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gayanin/test111q1 | 2023-10-08T20:24:58.000Z | [
"region:us"
] | gayanin | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: masked_sent
dtype: string
splits:
- name: train
num_bytes: 3368
num_examples: 16
- name: test
num_bytes: 572
num_examples: 2
- name: validation
num_bytes: 758
num_examples: 2
download_size: 11740
dataset_size: 4698
---
# Dataset Card for "test111q1"
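A minimal loading sketch, assuming the default configuration and splits declared in the YAML above:
```python
from datasets import load_dataset

# Splits: train (16 rows), test (2), validation (2);
# each row carries "text" and "masked_sent" string fields.
ds = load_dataset("gayanin/test111q1", split="train")
print(ds[0])
```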
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b | 2023-10-08T20:36:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zarablend-1.1-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zarablend-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T20:34:38.320909](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b/blob/main/results_2023-10-08T20-34-38.320909.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48174550452648596,\n\
\ \"acc_stderr\": 0.03530933614364666,\n \"acc_norm\": 0.485307857965695,\n\
\ \"acc_norm_stderr\": 0.03529500868775102,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241473,\n \"mc2\": 0.489972097170472,\n\
\ \"mc2_stderr\": 0.01566514218716543\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5221843003412969,\n \"acc_stderr\": 0.014597001927076136,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955267\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6020713005377415,\n\
\ \"acc_stderr\": 0.004884702412456089,\n \"acc_norm\": 0.7857996415056762,\n\
\ \"acc_norm_stderr\": 0.004094279871733677\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.02351729433596328,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02351729433596328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5064516129032258,\n \"acc_stderr\": 0.02844163823354051,\n \"\
acc_norm\": 0.5064516129032258,\n \"acc_norm_stderr\": 0.02844163823354051\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n \"\
acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5757575757575758,\n\
\ \"acc_stderr\": 0.03521224908841585,\n \"acc_norm\": 0.5757575757575758,\n\
\ \"acc_norm_stderr\": 0.03521224908841585\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126177,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126177\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844072,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6660550458715596,\n \"acc_stderr\": 0.020220554196736407,\n \"\
acc_norm\": 0.6660550458715596,\n \"acc_norm_stderr\": 0.020220554196736407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.0315469628565663,\n \"acc_norm\"\
: 0.3101851851851852,\n \"acc_norm_stderr\": 0.0315469628565663\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6323529411764706,\n\
\ \"acc_stderr\": 0.03384132045674119,\n \"acc_norm\": 0.6323529411764706,\n\
\ \"acc_norm_stderr\": 0.03384132045674119\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6624472573839663,\n \"acc_stderr\": 0.03078154910202622,\n\
\ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.03078154910202622\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.048657775704107696,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.048657775704107696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.029614323690456655,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.029614323690456655\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6679438058748404,\n\
\ \"acc_stderr\": 0.016841174655295728,\n \"acc_norm\": 0.6679438058748404,\n\
\ \"acc_norm_stderr\": 0.016841174655295728\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.015318257745976711,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.015318257745976711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n \
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.028043399858210624,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.028043399858210624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3559322033898305,\n\
\ \"acc_stderr\": 0.012228645537277568,\n \"acc_norm\": 0.3559322033898305,\n\
\ \"acc_norm_stderr\": 0.012228645537277568\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42810457516339867,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.03445789964362749,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.03445789964362749\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241473,\n \"mc2\": 0.489972097170472,\n\
\ \"mc2_stderr\": 0.01566514218716543\n }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-34-38.320909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-34-38.320909.parquet'
- config_name: results
data_files:
- split: 2023_10_08T20_34_38.320909
path:
- results_2023-10-08T20-34-38.320909.parquet
- split: latest
path:
- results_2023-10-08T20-34-38.320909.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zarablend-1.1-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zarablend-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
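As a complementary sketch (assuming the `datasets` library is installed; the config and split names below are taken from the configuration list above, and the variable names are illustrative), you can also load the aggregated metrics or a specific timestamped run:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b",
    "results",
    split="latest",
)

# Each run is also reachable through a split named after its timestamp:
run_details = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b",
    "harness_truthfulqa_mc_0",
    split="2023_10_08T20_34_38.320909",
)
```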
## Latest results
These are the [latest results from run 2023-10-08T20:34:38.320909](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b/blob/main/results_2023-10-08T20-34-38.320909.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48174550452648596,
"acc_stderr": 0.03530933614364666,
"acc_norm": 0.485307857965695,
"acc_norm_stderr": 0.03529500868775102,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241473,
"mc2": 0.489972097170472,
"mc2_stderr": 0.01566514218716543
},
"harness|arc:challenge|25": {
"acc": 0.5221843003412969,
"acc_stderr": 0.014597001927076136,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.014542104569955267
},
"harness|hellaswag|10": {
"acc": 0.6020713005377415,
"acc_stderr": 0.004884702412456089,
"acc_norm": 0.7857996415056762,
"acc_norm_stderr": 0.004094279871733677
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02351729433596328,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02351729433596328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998573,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841585,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841585
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126177,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126177
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844072,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6660550458715596,
"acc_stderr": 0.020220554196736407,
"acc_norm": 0.6660550458715596,
"acc_norm_stderr": 0.020220554196736407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.0315469628565663,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.0315469628565663
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674119,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674119
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.03078154910202622,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.03078154910202622
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.048657775704107696,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.048657775704107696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.029614323690456655,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.029614323690456655
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6679438058748404,
"acc_stderr": 0.016841174655295728,
"acc_norm": 0.6679438058748404,
"acc_norm_stderr": 0.016841174655295728
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976711,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210624,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3559322033898305,
"acc_stderr": 0.012228645537277568,
"acc_norm": 0.3559322033898305,
"acc_norm_stderr": 0.012228645537277568
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.03445789964362749,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.03445789964362749
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241473,
"mc2": 0.489972097170472,
"mc2_stderr": 0.01566514218716543
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yangwang825/20newsgroups | 2023-10-08T20:39:23.000Z | [
"region:us"
] | yangwang825 | null | null | null | 0 | 0 | Entry not found |
unaidedelf87777/the-da-vinci-code | 2023-10-08T20:39:40.000Z | [
"region:us"
] | unaidedelf87777 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9 | 2023-10-08T20:46:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-34b-v1.9
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-34b-v1.9](https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T20:44:59.061253](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9/blob/main/results_2023-10-08T20-44-59.061253.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5601866295865682,\n\
\ \"acc_stderr\": 0.03493787360526325,\n \"acc_norm\": 0.5641669907991567,\n\
\ \"acc_norm_stderr\": 0.03492601491560402,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.43920006507019715,\n\
\ \"mc2_stderr\": 0.014754612740891303\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.01461130570505698,\n\
\ \"acc_norm\": 0.5426621160409556,\n \"acc_norm_stderr\": 0.01455810654392406\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5581557458673571,\n\
\ \"acc_stderr\": 0.004955914693717968,\n \"acc_norm\": 0.7520414260107549,\n\
\ \"acc_norm_stderr\": 0.004309451164956192\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077839,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077839\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.027218889773308767,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.027218889773308767\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325205,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325205\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.032018671228777947,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.032018671228777947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955293,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955293\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.0313217980308329,\n \"acc_norm\"\
: 0.7254901960784313,\n \"acc_norm_stderr\": 0.0313217980308329\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \"\
acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7177522349936143,\n\
\ \"acc_stderr\": 0.016095302969878544,\n \"acc_norm\": 0.7177522349936143,\n\
\ \"acc_norm_stderr\": 0.016095302969878544\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.01639222189940708,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.01639222189940708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934012,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925657,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925657\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.01266770191960366,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.01266770191960366\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121603,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121603\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5277777777777778,\n \"acc_stderr\": 0.020196594933541197,\n \
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.020196594933541197\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573044,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.43920006507019715,\n\
\ \"mc2_stderr\": 0.014754612740891303\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-44-59.061253.parquet'
- config_name: results
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- results_2023-10-08T20-44-59.061253.parquet
- split: latest
path:
- results_2023-10-08T20-44-59.061253.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-34b-v1.9
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-34b-v1.9](https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9",
"harness_truthfulqa_mc_0",
split="train")
```
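You can also target a specific task configuration instead of TruthfulQA; a minimal sketch (the config and split names are taken verbatim from the `configs` list above):
```python
from datasets import load_dataset

# Load the world_religions MMLU subset for this model. The "latest" split
# always resolves to the most recent run; the timestamped split
# (e.g. "2023_10_08T20_44_59.061253") pins one specific run.
world_religions = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(world_religions)
```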
## Latest results
These are the [latest results from run 2023-10-08T20:44:59.061253](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9/blob/main/results_2023-10-08T20-44-59.061253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5601866295865682,
"acc_stderr": 0.03493787360526325,
"acc_norm": 0.5641669907991567,
"acc_norm_stderr": 0.03492601491560402,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.43920006507019715,
"mc2_stderr": 0.014754612740891303
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.01461130570505698,
"acc_norm": 0.5426621160409556,
"acc_norm_stderr": 0.01455810654392406
},
"harness|hellaswag|10": {
"acc": 0.5581557458673571,
"acc_stderr": 0.004955914693717968,
"acc_norm": 0.7520414260107549,
"acc_norm_stderr": 0.004309451164956192
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.042667634040995814,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.042667634040995814
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.02537952491077839,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.02537952491077839
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308767,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308767
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325205,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325205
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.032018671228777947,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.032018671228777947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955293,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955293
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653064,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653064
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.0313217980308329,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.0313217980308329
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7177522349936143,
"acc_stderr": 0.016095302969878544,
"acc_norm": 0.7177522349936143,
"acc_norm_stderr": 0.016095302969878544
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.026074314851657083,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.026074314851657083
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940708,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934012,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925657,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925657
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.01266770191960366,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.01266770191960366
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121603,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121603
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.020196594933541197,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.020196594933541197
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573044,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.43920006507019715,
"mc2_stderr": 0.014754612740891303
}
}
```
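The aggregated numbers above are also stored in the `results` configuration, so they can be inspected without parsing this JSON by hand; a minimal sketch (the exact column layout depends on what the run produced, so inspect it first):
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of the run;
# "latest" points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9",
    "results",
    split="latest",
)
print(results.column_names)
```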
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_openbmb__UltraRM-13b | 2023-10-08T20:47:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of openbmb/UltraRM-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openbmb/UltraRM-13b](https://huggingface.co/openbmb/UltraRM-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openbmb__UltraRM-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T20:45:47.827028](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraRM-13b/blob/main/results_2023-10-08T20-45-47.827028.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.2590094885923756,\n\
\ \"acc_stderr\": 0.03189328196381238,\n \"acc_norm\": 0.26002015960682157,\n\
\ \"acc_norm_stderr\": 0.031908858832774556,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862658,\n \"mc2\": 0.4790548760214852,\n\
\ \"mc2_stderr\": 0.016309927278874654\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22781569965870307,\n \"acc_stderr\": 0.01225670860232692,\n\
\ \"acc_norm\": 0.2815699658703072,\n \"acc_norm_stderr\": 0.013143376735009014\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25542720573590916,\n\
\ \"acc_stderr\": 0.004352098082984431,\n \"acc_norm\": 0.2613025293766182,\n\
\ \"acc_norm_stderr\": 0.004384465219070759\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361064,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361064\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.038270523579507554,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.038270523579507554\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749888,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749888\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21428571428571427,\n \"acc_stderr\": 0.021132859182754427,\n \"\
acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.021132859182754427\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.029252823291803638,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.029252823291803638\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26972477064220185,\n \"acc_stderr\": 0.01902848671111544,\n \"\
acc_norm\": 0.26972477064220185,\n \"acc_norm_stderr\": 0.01902848671111544\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.03179876342176852,\n \"\
acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03179876342176852\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.2647058823529412,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.32489451476793246,\n \"acc_stderr\": 0.030486039389105303,\n \"\
acc_norm\": 0.32489451476793246,\n \"acc_norm_stderr\": 0.030486039389105303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n\
\ \"acc_stderr\": 0.01586624307321506,\n \"acc_norm\": 0.26947637292464877,\n\
\ \"acc_norm_stderr\": 0.01586624307321506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.024405173935783238,\n\
\ \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.024405173935783238\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010088,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010088\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n\
\ \"acc_stderr\": 0.025922371788818798,\n \"acc_norm\": 0.2958199356913183,\n\
\ \"acc_norm_stderr\": 0.025922371788818798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266726,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266726\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.01089612365267665,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.01089612365267665\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2369281045751634,\n \"acc_stderr\": 0.017201662169789782,\n \
\ \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.017201662169789782\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982055,\n\
\ \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982055\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862658,\n \"mc2\": 0.4790548760214852,\n\
\ \"mc2_stderr\": 0.016309927278874654\n }\n}\n```"
repo_url: https://huggingface.co/openbmb/UltraRM-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-45-47.827028.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-45-47.827028.parquet'
- config_name: results
data_files:
- split: 2023_10_08T20_45_47.827028
path:
- results_2023-10-08T20-45-47.827028.parquet
- split: latest
path:
- results_2023-10-08T20-45-47.827028.parquet
---
# Dataset Card for Evaluation run of openbmb/UltraRM-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openbmb/UltraRM-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openbmb/UltraRM-13b](https://huggingface.co/openbmb/UltraRM-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openbmb__UltraRM-13b",
"harness_truthfulqa_mc_0",
split="train")
```
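The aggregated metrics can be loaded the same way through the "results" configuration listed in this card. A minimal sketch, assuming only the configuration and split names shown in the configs above (the actual column names depend on the parquet schema of the results file):
```python
from datasets import load_dataset

# "results" and the "latest" split are both declared in the configs
# section of this card; "latest" resolves to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_openbmb__UltraRM-13b",
    "results",
    split="latest",
)

# Inspect whatever columns the results parquet exposes.
print(results.column_names)
print(results[0])
```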
## Latest results
These are the [latest results from run 2023-10-08T20:45:47.827028](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraRM-13b/blob/main/results_2023-10-08T20-45-47.827028.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2590094885923756,
"acc_stderr": 0.03189328196381238,
"acc_norm": 0.26002015960682157,
"acc_norm_stderr": 0.031908858832774556,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862658,
"mc2": 0.4790548760214852,
"mc2_stderr": 0.016309927278874654
},
"harness|arc:challenge|25": {
"acc": 0.22781569965870307,
"acc_stderr": 0.01225670860232692,
"acc_norm": 0.2815699658703072,
"acc_norm_stderr": 0.013143376735009014
},
"harness|hellaswag|10": {
"acc": 0.25542720573590916,
"acc_stderr": 0.004352098082984431,
"acc_norm": 0.2613025293766182,
"acc_norm_stderr": 0.004384465219070759
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361064,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361064
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.038270523579507554,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.038270523579507554
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749888,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749888
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.021132859182754427,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.021132859182754427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.029252823291803638,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.029252823291803638
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26972477064220185,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.26972477064220185,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03179876342176852,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03179876342176852
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.32489451476793246,
"acc_stderr": 0.030486039389105303,
"acc_norm": 0.32489451476793246,
"acc_norm_stderr": 0.030486039389105303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.01586624307321506,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.01586624307321506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.024405173935783238,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.024405173935783238
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010088,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010088
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818798,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266726,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266726
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.01089612365267665,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.01089612365267665
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2369281045751634,
"acc_stderr": 0.017201662169789782,
"acc_norm": 0.2369281045751634,
"acc_norm_stderr": 0.017201662169789782
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982055,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982055
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862658,
"mc2": 0.4790548760214852,
"mc2_stderr": 0.016309927278874654
}
}
```
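Because the block above is plain JSON, the per-task scores can be tabulated directly. A minimal sketch, assuming the snippet has been saved locally as `results.json` (an illustrative filename):
```python
import json

# Parse the saved results snippet (the filename is illustrative).
with open("results.json") as f:
    results = json.load(f)

# Every harness task maps to a dict of metrics such as "acc" and
# "acc_norm"; "all" holds the averages, and truthfulqa reports
# mc1/mc2 instead of acc.
for task, metrics in sorted(results.items()):
    if task == "all":
        continue
    score = metrics.get("acc", metrics.get("mc2"))
    print(f"{task}: {score:.4f}")
```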
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_itsliupeng__llama2_7b_code | 2023-10-08T20:47:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of itsliupeng/llama2_7b_code
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama2_7b_code](https://huggingface.co/itsliupeng/llama2_7b_code)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_code\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T20:46:27.226805](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_code/blob/main/results_2023-10-08T20-46-27.226805.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48200491255702765,\n\
\ \"acc_stderr\": 0.03542377047214926,\n \"acc_norm\": 0.48588814598582436,\n\
\ \"acc_norm_stderr\": 0.03541224257584247,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.01520152224629997,\n \"mc2\": 0.3875988110597644,\n\
\ \"mc2_stderr\": 0.014080197843816668\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4872013651877133,\n \"acc_stderr\": 0.014606603181012538,\n\
\ \"acc_norm\": 0.5213310580204779,\n \"acc_norm_stderr\": 0.014598087973127108\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5621390161322446,\n\
\ \"acc_stderr\": 0.00495109780277595,\n \"acc_norm\": 0.7571200955984864,\n\
\ \"acc_norm_stderr\": 0.0042794671285607475\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.030709486992556545,\n\
\ \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.030709486992556545\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972595,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5193548387096775,\n \"acc_stderr\": 0.02842268740431211,\n \"\
acc_norm\": 0.5193548387096775,\n \"acc_norm_stderr\": 0.02842268740431211\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.41794871794871796,\n \"acc_stderr\": 0.02500732988246122,\n\
\ \"acc_norm\": 0.41794871794871796,\n \"acc_norm_stderr\": 0.02500732988246122\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719198,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.636697247706422,\n \"acc_stderr\": 0.020620603919625804,\n \"\
acc_norm\": 0.636697247706422,\n \"acc_norm_stderr\": 0.020620603919625804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329882,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329882\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5882352941176471,\n \"acc_stderr\": 0.03454236585380608,\n \"\
acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03454236585380608\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.030587326294702365,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.030587326294702365\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.048257293373563895,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.048257293373563895\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.029872577708891183,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.029872577708891183\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6309067688378033,\n\
\ \"acc_stderr\": 0.017256283109124616,\n \"acc_norm\": 0.6309067688378033,\n\
\ \"acc_norm_stderr\": 0.017256283109124616\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.02678881193156275,\n\
\ \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.02678881193156275\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n\
\ \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n\
\ \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.02791705074848462,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.02791705074848462\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.02779476010500873,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.02779476010500873\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.02840662780959095,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.02840662780959095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34810951760104303,\n\
\ \"acc_stderr\": 0.012166738993698203,\n \"acc_norm\": 0.34810951760104303,\n\
\ \"acc_norm_stderr\": 0.012166738993698203\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42483660130718953,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.42483660130718953,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708312,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708312\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.01520152224629997,\n \"mc2\": 0.3875988110597644,\n\
\ \"mc2_stderr\": 0.014080197843816668\n }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama2_7b_code
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-46-27.226805.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-46-27.226805.parquet'
- config_name: results
data_files:
- split: 2023_10_08T20_46_27.226805
path:
- results_2023-10-08T20-46-27.226805.parquet
- split: latest
path:
- results_2023-10-08T20-46-27.226805.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama2_7b_code
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/llama2_7b_code
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_code](https://huggingface.co/itsliupeng/llama2_7b_code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_code",
"harness_truthfulqa_mc_0",
split="train")
```
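The same call also works with the timestamped and "latest" splits defined in the configuration above. A minimal sketch, using only config and split names that appear in this card:
```python
from datasets import load_dataset

# "latest" always mirrors the most recent timestamped split of a task config
hellaswag_details = load_dataset(
    "open-llm-leaderboard/details_itsliupeng__llama2_7b_code",
    "harness_hellaswag_10",
    split="latest",
)

# The aggregated metrics live in the separate "results" configuration
aggregated = load_dataset(
    "open-llm-leaderboard/details_itsliupeng__llama2_7b_code",
    "results",
    split="latest",
)
```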
## Latest results
These are the [latest results from run 2023-10-08T20:46:27.226805](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_code/blob/main/results_2023-10-08T20-46-27.226805.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48200491255702765,
"acc_stderr": 0.03542377047214926,
"acc_norm": 0.48588814598582436,
"acc_norm_stderr": 0.03541224257584247,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.01520152224629997,
"mc2": 0.3875988110597644,
"mc2_stderr": 0.014080197843816668
},
"harness|arc:challenge|25": {
"acc": 0.4872013651877133,
"acc_stderr": 0.014606603181012538,
"acc_norm": 0.5213310580204779,
"acc_norm_stderr": 0.014598087973127108
},
"harness|hellaswag|10": {
"acc": 0.5621390161322446,
"acc_stderr": 0.00495109780277595,
"acc_norm": 0.7571200955984864,
"acc_norm_stderr": 0.0042794671285607475
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4679245283018868,
"acc_stderr": 0.030709486992556545,
"acc_norm": 0.4679245283018868,
"acc_norm_stderr": 0.030709486992556545
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972595,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41794871794871796,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.41794871794871796,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719198,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.636697247706422,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.636697247706422,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329882,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329882
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03454236585380608,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03454236585380608
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.030587326294702365,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.030587326294702365
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.048257293373563895,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.048257293373563895
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891183,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891183
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6309067688378033,
"acc_stderr": 0.017256283109124616,
"acc_norm": 0.6309067688378033,
"acc_norm_stderr": 0.017256283109124616
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.02678881193156275,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.02678881193156275
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.02791705074848462,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.02791705074848462
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.02779476010500873,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.02779476010500873
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34810951760104303,
"acc_stderr": 0.012166738993698203,
"acc_norm": 0.34810951760104303,
"acc_norm_stderr": 0.012166738993698203
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42483660130718953,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.42483660130718953,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.01520152224629997,
"mc2": 0.3875988110597644,
"mc2_stderr": 0.014080197843816668
}
}
```
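To work with the raw results file shown above rather than the parquet splits, it can be downloaded directly from the repository. A minimal sketch using `huggingface_hub`; the filename is taken from the link at the top of this section, and the key layout is assumed to follow the JSON displayed above:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from the dataset repo
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_itsliupeng__llama2_7b_code",
    filename="results_2023-10-08T20-46-27.226805.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Key layout as displayed above; the on-disk file may wrap these under a top-level key
print(results["all"]["acc_norm"])
```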
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted | 2023-10-08T21:14:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/MLewd-ReMM-L2-Chat-20B-Inverted
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MLewd-ReMM-L2-Chat-20B-Inverted](https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T21:13:04.392733](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted/blob/main/results_2023-10-08T21-13-04.392733.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.5816164336009417,\n\
\ \"acc_stderr\": 0.03420587695783267,\n \"acc_norm\": 0.5852291914116693,\n\
\ \"acc_norm_stderr\": 0.034183268720349165,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5376513807870049,\n\
\ \"mc2_stderr\": 0.015911441950703046\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449693,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672877\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6648078072097192,\n\
\ \"acc_stderr\": 0.004710928569985762,\n \"acc_norm\": 0.8532164907388966,\n\
\ \"acc_norm_stderr\": 0.003531667185235823\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286644,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286644\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.02432631052914915,\n \"acc_norm\"\
: 0.335978835978836,\n \"acc_norm_stderr\": 0.02432631052914915\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.027807032360686088,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.027807032360686088\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n\
\ \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n\
\ \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n\
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946002,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946002\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879706,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879706\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047703,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016633,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5376513807870049,\n\
\ \"mc2_stderr\": 0.015911441950703046\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|arc:challenge|25_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hellaswag|10_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T21-13-04.392733.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T21-13-04.392733.parquet'
- config_name: results
data_files:
- split: 2023_10_08T21_13_04.392733
path:
- results_2023-10-08T21-13-04.392733.parquet
- split: latest
path:
- results_2023-10-08T21-13-04.392733.parquet
---
# Dataset Card for Evaluation run of Undi95/MLewd-ReMM-L2-Chat-20B-Inverted
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MLewd-ReMM-L2-Chat-20B-Inverted](https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted",
"harness_truthfulqa_mc_0",
split="train")
```
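The same pattern applies to any configuration listed in the metadata above. As a minimal sketch (the `results` config and `latest` split names are taken from that metadata), here is how you might load the aggregated results of the most recent run instead of the per-task details:
```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration; the "latest" split
# always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted",
    "results",
    split="latest",
)
```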
## Latest results
These are the [latest results from run 2023-10-08T21:13:04.392733](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted/blob/main/results_2023-10-08T21-13-04.392733.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5816164336009417,
"acc_stderr": 0.03420587695783267,
"acc_norm": 0.5852291914116693,
"acc_norm_stderr": 0.034183268720349165,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5376513807870049,
"mc2_stderr": 0.015911441950703046
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449693,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672877
},
"harness|hellaswag|10": {
"acc": 0.6648078072097192,
"acc_stderr": 0.004710928569985762,
"acc_norm": 0.8532164907388966,
"acc_norm_stderr": 0.003531667185235823
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286644,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286644
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.02432631052914915,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.02432631052914915
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.027807032360686088,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.027807032360686088
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946002,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946002
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879706,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047703,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016633,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5376513807870049,
"mc2_stderr": 0.015911441950703046
}
}
```
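As a rough illustration of how these per-task numbers aggregate, the sketch below computes the mean accuracy over the MMLU (`hendrycksTest`) subjects. The `scores` dictionary here is a hypothetical, hand-copied subset of the dictionary printed above; in practice it would hold the full results.
```python
# Minimal sketch: average the per-task "acc" values of the MMLU
# ("hendrycksTest") subjects. `scores` is a small hand-copied subset
# of the results dictionary printed above, kept short for illustration.
scores = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.39},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4740740740740741},
    "harness|truthfulqa:mc|0": {"mc1": 0.3733170134638923},
}
mmlu = {k: v for k, v in scores.items() if k.startswith("harness|hendrycksTest-")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subjects, mean acc = {mean_acc:.4f}")
```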
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b | 2023-10-08T21:19:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of pankajmathur/Lima_Unchained_70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/Lima_Unchained_70b](https://huggingface.co/pankajmathur/Lima_Unchained_70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T21:18:19.268295](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b/blob/main/results_2023-10-08T21-18-19.268295.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6986010740840336,\n\
\ \"acc_stderr\": 0.03092585532749937,\n \"acc_norm\": 0.7027436596061677,\n\
\ \"acc_norm_stderr\": 0.030894597930437574,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.48762211996558275,\n\
\ \"mc2_stderr\": 0.014357257889118747\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.014077223108470139,\n\
\ \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.680740888269269,\n\
\ \"acc_stderr\": 0.00465236827384552,\n \"acc_norm\": 0.8765186217884884,\n\
\ \"acc_norm_stderr\": 0.00328316586763137\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.032790004063100495,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.032790004063100495\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.031674733837957166,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.031674733837957166\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.03047297336338003,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.03047297336338003\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\"\
: 0.4312169312169312,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.02188617856717252,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.02188617856717252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.01742697415424052,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.01742697415424052\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n\
\ \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281732,\n \"\
acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8715596330275229,\n \"acc_stderr\": 0.014344977542914318,\n \"\
acc_norm\": 0.8715596330275229,\n \"acc_norm_stderr\": 0.014344977542914318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8921568627450981,\n \"acc_stderr\": 0.021770522281368398,\n \"\
acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.021770522281368398\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867447,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8607918263090677,\n\
\ \"acc_stderr\": 0.012378786101885154,\n \"acc_norm\": 0.8607918263090677,\n\
\ \"acc_norm_stderr\": 0.012378786101885154\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.021742519835276274,\n\
\ \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.021742519835276274\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48156424581005586,\n\
\ \"acc_stderr\": 0.01671113049778282,\n \"acc_norm\": 0.48156424581005586,\n\
\ \"acc_norm_stderr\": 0.01671113049778282\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046112,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046112\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.01993508609214989,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.01993508609214989\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5365058670143416,\n\
\ \"acc_stderr\": 0.012736153390214966,\n \"acc_norm\": 0.5365058670143416,\n\
\ \"acc_norm_stderr\": 0.012736153390214966\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041503,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041503\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n\
\ \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n\
\ \"acc_stderr\": 0.026176967197866764,\n \"acc_norm\": 0.7877551020408163,\n\
\ \"acc_norm_stderr\": 0.026176967197866764\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n\
\ \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n\
\ \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n\
\ \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n\
\ \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n\
\ \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.34516523867809057,\n \"mc1_stderr\": 0.01664310331927494,\n\
\ \"mc2\": 0.48762211996558275,\n \"mc2_stderr\": 0.014357257889118747\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/Lima_Unchained_70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|arc:challenge|25_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hellaswag|10_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T21-18-19.268295.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T21-18-19.268295.parquet'
- config_name: results
data_files:
- split: 2023_10_08T21_18_19.268295
path:
- results_2023-10-08T21-18-19.268295.parquet
- split: latest
path:
- results_2023-10-08T21-18-19.268295.parquet
---
# Dataset Card for Evaluation run of pankajmathur/Lima_Unchained_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/Lima_Unchained_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/Lima_Unchained_70b](https://huggingface.co/pankajmathur/Lima_Unchained_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b",
"harness_truthfulqa_mc_0",
split="train")
```
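The split names in the snippet come straight from the configuration list in this card's metadata: every configuration exposes a `latest` split plus one split per timestamped run. As a minimal sketch (assuming a recent version of the `datasets` library), you can enumerate the configurations and pin a specific run like this:
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b"

# Enumerate the 61 configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:3])

# "latest" always points at the most recent run, while a timestamped split
# name (here the only run so far) pins the details to that exact run.
latest = load_dataset(REPO, "harness_arc_challenge_25", split="latest")
pinned = load_dataset(
    REPO, "harness_arc_challenge_25", split="2023_10_08T21_18_19.268295"
)
assert latest.num_rows == pinned.num_rows  # both point at the same parquet file
```
Pinning the timestamped split is useful if you want results that stay stable even after the model is re-evaluated and `latest` moves on.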
## Latest results
These are the [latest results from run 2023-10-08T21:18:19.268295](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b/blob/main/results_2023-10-08T21-18-19.268295.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6986010740840336,
"acc_stderr": 0.03092585532749937,
"acc_norm": 0.7027436596061677,
"acc_norm_stderr": 0.030894597930437574,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.48762211996558275,
"mc2_stderr": 0.014357257889118747
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.014077223108470139,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038167
},
"harness|hellaswag|10": {
"acc": 0.680740888269269,
"acc_stderr": 0.00465236827384552,
"acc_norm": 0.8765186217884884,
"acc_norm_stderr": 0.00328316586763137
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.032790004063100495,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.032790004063100495
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.031674733837957166,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.031674733837957166
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.03047297336338003,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.03047297336338003
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.0255064816981382,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.0255064816981382
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717252,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.01742697415424052,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.01742697415424052
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8715596330275229,
"acc_stderr": 0.014344977542914318,
"acc_norm": 0.8715596330275229,
"acc_norm_stderr": 0.014344977542914318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.021770522281368398,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.021770522281368398
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342344,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342344
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867447,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8607918263090677,
"acc_stderr": 0.012378786101885154,
"acc_norm": 0.8607918263090677,
"acc_norm_stderr": 0.012378786101885154
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.021742519835276274,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.021742519835276274
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48156424581005586,
"acc_stderr": 0.01671113049778282,
"acc_norm": 0.48156424581005586,
"acc_norm_stderr": 0.01671113049778282
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046112,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.01993508609214989,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.01993508609214989
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5365058670143416,
"acc_stderr": 0.012736153390214966,
"acc_norm": 0.5365058670143416,
"acc_norm_stderr": 0.012736153390214966
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041503,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041503
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.48762211996558275,
"mc2_stderr": 0.014357257889118747
}
}
```
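The aggregated numbers above are also available as data rather than raw JSON: the `results` configuration defined in this card's metadata stores them with the same split naming. A minimal sketch of loading them:
```python
from datasets import load_dataset

# The aggregated metrics shown above live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b",
    "results",
    split="latest",
)
print(results)
```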
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
katryo/grvr-spl-1 | 2023-10-08T21:50:00.000Z | [
"region:us"
] | katryo | null | null | null | 0 | 0 | Entry not found |
Monkaro/1024x1024-Woman | 2023-10-08T21:59:06.000Z | [
"region:us"
] | Monkaro | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0 | 2023-10-08T21:57:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-34b-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-34b-v2.0](https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T21:55:38.209151](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0/blob/main/results_2023-10-08T21-55-38.209151.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5464126519732339,\n\
\ \"acc_stderr\": 0.03484059078525412,\n \"acc_norm\": 0.5502254949771422,\n\
\ \"acc_norm_stderr\": 0.034828419990997346,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.45211730267295,\n\
\ \"mc2_stderr\": 0.014761798027932871\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.01455594976049644\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5631348336984664,\n\
\ \"acc_stderr\": 0.004949842967331427,\n \"acc_norm\": 0.7565226050587532,\n\
\ \"acc_norm_stderr\": 0.004283036686282379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296562,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296562\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743743,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6419354838709678,\n \"acc_stderr\": 0.02727389059430064,\n \"\
acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.02727389059430064\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.031821550509166456,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.031821550509166456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7339449541284404,\n \"acc_stderr\": 0.018946022322225604,\n \"\
acc_norm\": 0.7339449541284404,\n \"acc_norm_stderr\": 0.018946022322225604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7126436781609196,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.7126436781609196,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587102,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587102\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n\
\ \"acc_stderr\": 0.012579699631289265,\n \"acc_norm\": 0.41395045632333766,\n\
\ \"acc_norm_stderr\": 0.012579699631289265\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6693877551020408,\n\
\ \"acc_stderr\": 0.030116426296540603,\n \"acc_norm\": 0.6693877551020408,\n\
\ \"acc_norm_stderr\": 0.030116426296540603\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014635,\n\
\ \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014635\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n\
\ \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n\
\ \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.7192982456140351,\n\
\ \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.30354957160342716,\n \"mc1_stderr\": 0.016095884155386854,\n\
\ \"mc2\": 0.45211730267295,\n \"mc2_stderr\": 0.014761798027932871\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|arc:challenge|25_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hellaswag|10_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T21-55-38.209151.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T21-55-38.209151.parquet'
- config_name: results
data_files:
- split: 2023_10_08T21_55_38.209151
path:
- results_2023-10-08T21-55-38.209151.parquet
- split: latest
path:
- results_2023-10-08T21-55-38.209151.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-34b-v2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-34b-v2.0](https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0",
"harness_truthfulqa_mc_0",
split="train")
```
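Every per-task configuration listed in this card follows the same pattern, so, as a minimal sketch using only the config and split names declared above, you can also load the details of a single subtask at its latest run directly:
```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask; the "latest" split is an
# alias that tracks the most recent timestamped run of this eval.
details = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```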
## Latest results
These are the [latest results from run 2023-10-08T21:55:38.209151](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0/blob/main/results_2023-10-08T21-55-38.209151.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5464126519732339,
"acc_stderr": 0.03484059078525412,
"acc_norm": 0.5502254949771422,
"acc_norm_stderr": 0.034828419990997346,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.45211730267295,
"mc2_stderr": 0.014761798027932871
},
"harness|arc:challenge|25": {
"acc": 0.5119453924914675,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.01455594976049644
},
"harness|hellaswag|10": {
"acc": 0.5631348336984664,
"acc_stderr": 0.004949842967331427,
"acc_norm": 0.7565226050587532,
"acc_norm_stderr": 0.004283036686282379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296562,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296562
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743743,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.02727389059430064,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.02727389059430064
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.031821550509166456,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.031821550509166456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.018946022322225604,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.018946022322225604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7126436781609196,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.7126436781609196,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931505,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931505
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630998,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587102,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587102
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289265,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289265
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014635,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014635
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.45211730267295,
"mc2_stderr": 0.014761798027932871
}
}
```
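The aggregated numbers above are also stored in the "results" configuration declared in this card, so (as a small illustrative snippet rather than an official recipe) they can be reloaded without parsing the JSON by hand:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics shown above;
# its "latest" split points at the newest results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0",
    "results",
    split="latest",
)
print(results)
```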
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
munyamakosa/creditscoring | 2023-10-08T22:22:44.000Z | [
"region:us"
] | munyamakosa | null | null | null | 0 | 0 | Entry not found |
Joisas/s | 2023-10-08T22:49:53.000Z | [
"region:us"
] | Joisas | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Aeala__Alpaca-elina-65b | 2023-10-08T22:34:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Aeala/Alpaca-elina-65b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aeala/Alpaca-elina-65b](https://huggingface.co/Aeala/Alpaca-elina-65b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aeala__Alpaca-elina-65b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-08T22:34:28.379829](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__Alpaca-elina-65b/blob/main/results_2023-10-08T22-34-28.379829.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.27736996644295303,\n\
\ \"em_stderr\": 0.004584873651869028,\n \"f1\": 0.33694211409395997,\n\
\ \"f1_stderr\": 0.004497646539610947,\n \"acc\": 0.5520523608267965,\n\
\ \"acc_stderr\": 0.011722735218747584\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.27736996644295303,\n \"em_stderr\": 0.004584873651869028,\n\
\ \"f1\": 0.33694211409395997,\n \"f1_stderr\": 0.004497646539610947\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29037149355572406,\n \
\ \"acc_stderr\": 0.012503592481818955\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676211\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aeala/Alpaca-elina-65b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_08T22_34_28.379829
path:
- '**/details_harness|drop|3_2023-10-08T22-34-28.379829.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-08T22-34-28.379829.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_08T22_34_28.379829
path:
- '**/details_harness|gsm8k|5_2023-10-08T22-34-28.379829.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-08T22-34-28.379829.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_08T22_34_28.379829
path:
- '**/details_harness|winogrande|5_2023-10-08T22-34-28.379829.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-08T22-34-28.379829.parquet'
- config_name: results
data_files:
- split: 2023_10_08T22_34_28.379829
path:
- results_2023-10-08T22-34-28.379829.parquet
- split: latest
path:
- results_2023-10-08T22-34-28.379829.parquet
---
# Dataset Card for Evaluation run of Aeala/Alpaca-elina-65b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aeala/Alpaca-elina-65b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aeala/Alpaca-elina-65b](https://huggingface.co/Aeala/Alpaca-elina-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aeala__Alpaca-elina-65b",
"harness_winogrande_5",
split="train")
```
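Each task configuration also exposes a `latest` split pointing at the most recent evaluation files, so you can load the per-example details of a single task directly. Here is a minimal sketch using the `datasets` library (the column layout inside the parquet files may vary between harness versions):
```python
from datasets import load_dataset

# Per-example details for the DROP task; the "latest" split always tracks
# the most recent run listed in the configs above.
drop_details = load_dataset(
    "open-llm-leaderboard/details_Aeala__Alpaca-elina-65b",
    "harness_drop_3",
    split="latest",
)
print(drop_details)  # inspect the row count and available columns
```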
## Latest results
These are the [latest results from run 2023-10-08T22:34:28.379829](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__Alpaca-elina-65b/blob/main/results_2023-10-08T22-34-28.379829.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.27736996644295303,
"em_stderr": 0.004584873651869028,
"f1": 0.33694211409395997,
"f1_stderr": 0.004497646539610947,
"acc": 0.5520523608267965,
"acc_stderr": 0.011722735218747584
},
"harness|drop|3": {
"em": 0.27736996644295303,
"em_stderr": 0.004584873651869028,
"f1": 0.33694211409395997,
"f1_stderr": 0.004497646539610947
},
"harness|gsm8k|5": {
"acc": 0.29037149355572406,
"acc_stderr": 0.012503592481818955
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.010941877955676211
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lifarias/Rasha | 2023-10-08T23:14:32.000Z | [
"license:openrail",
"region:us"
] | lifarias | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy | 2023-10-08T23:28:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of u-chom/preded-title-amazongoogle-abtbuy
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [u-chom/preded-title-amazongoogle-abtbuy](https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-08T23:27:01.372351](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy/blob/main/results_2023-10-08T23-27-01.372351.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38488561065128146,\n\
\ \"acc_stderr\": 0.03460083230388379,\n \"acc_norm\": 0.38889827098513907,\n\
\ \"acc_norm_stderr\": 0.034587970268505575,\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.015528566637087281,\n \"mc2\": 0.4164930056701617,\n\
\ \"mc2_stderr\": 0.013916947335276144\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4667235494880546,\n \"acc_stderr\": 0.014578995859605808,\n\
\ \"acc_norm\": 0.5093856655290102,\n \"acc_norm_stderr\": 0.014608816322065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5873332005576578,\n\
\ \"acc_stderr\": 0.00491307684443376,\n \"acc_norm\": 0.7814180442143,\n\
\ \"acc_norm_stderr\": 0.004124396294659584\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3584905660377358,\n \"acc_stderr\": 0.02951470358398177,\n\
\ \"acc_norm\": 0.3584905660377358,\n \"acc_norm_stderr\": 0.02951470358398177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262883,\n \"\
acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262883\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.35161290322580646,\n \"acc_stderr\": 0.027162537826948458,\n \"\
acc_norm\": 0.35161290322580646,\n \"acc_norm_stderr\": 0.027162537826948458\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"\
acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4484848484848485,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.4484848484848485,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.40404040404040403,\n \"acc_stderr\": 0.03496130972056127,\n \"\
acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.03496130972056127\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.03590910952235524,\n\
\ \"acc_norm\": 0.5492227979274611,\n \"acc_norm_stderr\": 0.03590910952235524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n\
\ \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43853211009174314,\n \"acc_stderr\": 0.021274713073954562,\n \"\
acc_norm\": 0.43853211009174314,\n \"acc_norm_stderr\": 0.021274713073954562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824685,\n \"\
acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824685\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3881856540084388,\n \"acc_stderr\": 0.0317229500433233,\n \
\ \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.0317229500433233\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.512396694214876,\n \"acc_stderr\": 0.045629515481807666,\n \"\
acc_norm\": 0.512396694214876,\n \"acc_norm_stderr\": 0.045629515481807666\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.038566721635489125,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.038566721635489125\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.048657775704107675,\n\
\ \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.048657775704107675\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.5491698595146871,\n \"acc_stderr\": 0.017793297572699034,\n\
\ \"acc_norm\": 0.5491698595146871,\n \"acc_norm_stderr\": 0.017793297572699034\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.02658923114217426,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.02658923114217426\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n\
\ \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.40522875816993464,\n\
\ \"acc_stderr\": 0.02811092849280907,\n \"acc_norm\": 0.40522875816993464,\n\
\ \"acc_norm_stderr\": 0.02811092849280907\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5080385852090032,\n \"acc_stderr\": 0.028394421370984538,\n\
\ \"acc_norm\": 0.5080385852090032,\n \"acc_norm_stderr\": 0.028394421370984538\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.43209876543209874,\n\
\ \"acc_stderr\": 0.027563010971606676,\n \"acc_norm\": 0.43209876543209874,\n\
\ \"acc_norm_stderr\": 0.027563010971606676\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n\
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33572359843546284,\n\
\ \"acc_stderr\": 0.012061304157664607,\n \"acc_norm\": 0.33572359843546284,\n\
\ \"acc_norm_stderr\": 0.012061304157664607\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4035947712418301,\n \"acc_stderr\": 0.019848280168401154,\n \
\ \"acc_norm\": 0.4035947712418301,\n \"acc_norm_stderr\": 0.019848280168401154\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2653061224489796,\n \"acc_stderr\": 0.028263889943784606,\n\
\ \"acc_norm\": 0.2653061224489796,\n \"acc_norm_stderr\": 0.028263889943784606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n\
\ \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.03733756969066165,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.03733756969066165\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2692778457772338,\n \"mc1_stderr\": 0.015528566637087281,\n\
\ \"mc2\": 0.4164930056701617,\n \"mc2_stderr\": 0.013916947335276144\n\
\ }\n}\n```"
repo_url: https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|arc:challenge|25_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hellaswag|10_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T23-27-01.372351.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T23-27-01.372351.parquet'
- config_name: results
data_files:
- split: 2023_10_08T23_27_01.372351
path:
- results_2023-10-08T23-27-01.372351.parquet
- split: latest
path:
- results_2023-10-08T23-27-01.372351.parquet
---
# Dataset Card for Evaluation run of u-chom/preded-title-amazongoogle-abtbuy
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [u-chom/preded-title-amazongoogle-abtbuy](https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy",
"harness_truthfulqa_mc_0",
split="train")
```
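The aggregated metrics shown below are also stored in the "results" configuration, whose `latest` split points at the most recent results parquet file. A minimal sketch of loading them (the exact column layout of the results file is not documented here):
```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" points at the most recent
# results parquet file listed in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy",
    "results",
    split="latest",
)
print(results)
```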
## Latest results
These are the [latest results from run 2023-10-08T23:27:01.372351](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy/blob/main/results_2023-10-08T23-27-01.372351.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38488561065128146,
"acc_stderr": 0.03460083230388379,
"acc_norm": 0.38889827098513907,
"acc_norm_stderr": 0.034587970268505575,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087281,
"mc2": 0.4164930056701617,
"mc2_stderr": 0.013916947335276144
},
"harness|arc:challenge|25": {
"acc": 0.4667235494880546,
"acc_stderr": 0.014578995859605808,
"acc_norm": 0.5093856655290102,
"acc_norm_stderr": 0.014608816322065
},
"harness|hellaswag|10": {
"acc": 0.5873332005576578,
"acc_stderr": 0.00491307684443376,
"acc_norm": 0.7814180442143,
"acc_norm_stderr": 0.004124396294659584
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.042667634040995814,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.042667634040995814
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3584905660377358,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.3584905660377358,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262883,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262883
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.35161290322580646,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.35161290322580646,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4484848484848485,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.4484848484848485,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.40404040404040403,
"acc_stderr": 0.03496130972056127,
"acc_norm": 0.40404040404040403,
"acc_norm_stderr": 0.03496130972056127
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.03590910952235524,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.03590910952235524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43853211009174314,
"acc_stderr": 0.021274713073954562,
"acc_norm": 0.43853211009174314,
"acc_norm_stderr": 0.021274713073954562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824685,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3881856540084388,
"acc_stderr": 0.0317229500433233,
"acc_norm": 0.3881856540084388,
"acc_norm_stderr": 0.0317229500433233
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.045629515481807666,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.045629515481807666
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.038566721635489125,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.038566721635489125
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.4077669902912621,
"acc_stderr": 0.048657775704107675,
"acc_norm": 0.4077669902912621,
"acc_norm_stderr": 0.048657775704107675
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5491698595146871,
"acc_stderr": 0.017793297572699034,
"acc_norm": 0.5491698595146871,
"acc_norm_stderr": 0.017793297572699034
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.40522875816993464,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.40522875816993464,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.028394421370984538,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.028394421370984538
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.43209876543209874,
"acc_stderr": 0.027563010971606676,
"acc_norm": 0.43209876543209874,
"acc_norm_stderr": 0.027563010971606676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.02737412888263115,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.02737412888263115
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33572359843546284,
"acc_stderr": 0.012061304157664607,
"acc_norm": 0.33572359843546284,
"acc_norm_stderr": 0.012061304157664607
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4035947712418301,
"acc_stderr": 0.019848280168401154,
"acc_norm": 0.4035947712418301,
"acc_norm_stderr": 0.019848280168401154
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2653061224489796,
"acc_stderr": 0.028263889943784606,
"acc_norm": 0.2653061224489796,
"acc_norm_stderr": 0.028263889943784606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123937,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123937
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066165,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066165
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087281,
"mc2": 0.4164930056701617,
"mc2_stderr": 0.013916947335276144
}
}
```
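To pull the aggregated scores shown above rather than the per-example details, here is a minimal sketch (assuming the `results` configuration and `latest` split naming that these evaluation datasets use):
```python
from datasets import load_dataset

# Aggregated scores live in the "results" configuration; the "latest"
# split always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B",
    "results",
    split="latest",
)
```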
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
diwank/storycommonsense-chatml | 2023-10-08T23:31:38.000Z | [
"region:us"
] | diwank | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: chatml
list:
- name: content
dtype: string
- name: name
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 21803747
num_examples: 14738
download_size: 4868426
dataset_size: 21803747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "storycommonsense-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benxh/us-library-of-congress-subjects | 2023-10-08T23:38:01.000Z | [
"region:us"
] | benxh | null | null | null | 0 | 0 | Just all the subjects from the US Library of Congress, cleaned up into JSONL files (metadata is missing).
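For illustration, a minimal sketch for reading one of the JSONL files (the filename `subjects.jsonl` is hypothetical):
```python
import json

# Read a JSONL file line by line; each non-empty line is one subject record.
# "subjects.jsonl" is a hypothetical filename -- substitute the real one.
with open("subjects.jsonl", encoding="utf-8") as f:
    subjects = [json.loads(line) for line in f if line.strip()]

print(len(subjects), "subject records loaded")
```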
Grab the latest here: https://id.loc.gov/authorities/subjects.html |
teragron/poems | 2023-10-09T00:02:49.000Z | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:tr",
"license:mit",
"art",
"region:us"
] | teragron | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-generation
language:
- tr
tags:
- art
pretty_name: poemmi
size_categories:
- 1K<n<10K
--- |
Zerenidel/DG_Simple | 2023-10-09T03:16:36.000Z | [
"region:us"
] | Zerenidel | null | null | null | 0 | 0 | Entry not found |
Ar4ikov/celebA_spoof | 2023-10-09T03:25:23.000Z | [
"region:us"
] | Ar4ikov | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Filepath
dtype: image
- name: Bbox
sequence: int64
- name: Class
dtype: string
splits:
- name: train
num_bytes: 46432284811.335
num_examples: 419935
- name: valid
num_bytes: 4163631829.316
num_examples: 46738
- name: test
num_bytes: 32416692607.675
num_examples: 59191
download_size: 72011056582
dataset_size: 83012609248.326
---
# Dataset Card for "celebA_spoof"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Monkaro/Starsmitten | 2023-10-09T01:04:02.000Z | [
"region:us"
] | Monkaro | null | null | null | 0 | 0 | Entry not found |
nlplabtdtu/edu-crawl-with-date | 2023-10-09T01:49:10.000Z | [
"region:us"
] | nlplabtdtu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: title
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: date
dtype: string
- name: flt_dates
sequence: string
splits:
- name: train
num_bytes: 1070649713
num_examples: 278902
download_size: 387393861
dataset_size: 1070649713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "edu-crawl-with-date"
Education crawl data with time information (month/year).
The time data was populated in the following ways:
- extracted from the body text
- re-crawled a few pages (rare)
Currently 190692 rows (~68.37%) have time data.
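As a minimal sketch (assuming `flt_dates` is an empty sequence when no date was recovered), the rows that carry time data can be filtered like this:
```python
from datasets import load_dataset

ds = load_dataset("nlplabtdtu/edu-crawl-with-date", split="train")

# Keep rows whose extracted date list is non-empty (assumes flt_dates
# is an empty sequence when no date could be recovered).
dated = ds.filter(lambda row: len(row["flt_dates"]) > 0)
print(f"{len(dated)} of {len(ds)} rows have time data")
```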
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1 | 2023-10-09T01:35:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of s1ghhh/medllama-2-70b-qlora-1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [s1ghhh/medllama-2-70b-qlora-1.1](https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T01:34:27.623935](https://huggingface.co/datasets/open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1/blob/main/results_2023-10-09T01-34-27.623935.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.708690489375333,\n\
\ \"acc_stderr\": 0.030558651569776513,\n \"acc_norm\": 0.712790648817897,\n\
\ \"acc_norm_stderr\": 0.030527861535455017,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965838,\n \"mc2\": 0.5240600477973963,\n\
\ \"mc2_stderr\": 0.014676584418259612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620197,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238361\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6767576180043816,\n\
\ \"acc_stderr\": 0.004667585072717502,\n \"acc_norm\": 0.8717386974706234,\n\
\ \"acc_norm_stderr\": 0.003336971535131172\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.026199808807561918,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.026199808807561918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.03476599607516477,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.03476599607516477\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.021469735576055346,\n \"\
acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055346\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607548,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607548\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7307692307692307,\n \"acc_stderr\": 0.022489389793654817,\n\
\ \"acc_norm\": 0.7307692307692307,\n \"acc_norm_stderr\": 0.022489389793654817\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205146,\n \
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205146\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9045871559633027,\n \"acc_stderr\": 0.012595899282335793,\n \"\
acc_norm\": 0.9045871559633027,\n \"acc_norm_stderr\": 0.012595899282335793\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6712962962962963,\n \"acc_stderr\": 0.032036140846700596,\n \"\
acc_norm\": 0.6712962962962963,\n \"acc_norm_stderr\": 0.032036140846700596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065494,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035216,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035216\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628124,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628124\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.012036729568216055,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.012036729568216055\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.020776761102512965,\n\
\ \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.020776761102512965\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4960893854748603,\n\
\ \"acc_stderr\": 0.016721990073156657,\n \"acc_norm\": 0.4960893854748603,\n\
\ \"acc_norm_stderr\": 0.016721990073156657\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.019242526226544543,\n\
\ \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.019242526226544543\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5780141843971631,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.5780141843971631,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5730117340286832,\n\
\ \"acc_stderr\": 0.012633353557534414,\n \"acc_norm\": 0.5730117340286832,\n\
\ \"acc_norm_stderr\": 0.012633353557534414\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.0242310133705411,\n\
\ \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.0242310133705411\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7712418300653595,\n \"acc_stderr\": 0.016992723465466215,\n \
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.016992723465466215\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813296,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965838,\n \"mc2\": 0.5240600477973963,\n\
\ \"mc2_stderr\": 0.014676584418259612\n }\n}\n```"
repo_url: https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|arc:challenge|25_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hellaswag|10_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T01-34-27.623935.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T01-34-27.623935.parquet'
- config_name: results
data_files:
- split: 2023_10_09T01_34_27.623935
path:
- results_2023-10-09T01-34-27.623935.parquet
- split: latest
path:
- results_2023-10-09T01-34-27.623935.parquet
---
# Dataset Card for Evaluation run of s1ghhh/medllama-2-70b-qlora-1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [s1ghhh/medllama-2-70b-qlora-1.1](https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1",
"harness_truthfulqa_mc_0",
split="train")
```
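Each evaluated task also has its own configuration with a `latest` split (see the `configs` list in the YAML header); for example, a sketch loading one of the MMLU subtasks:
```python
from datasets import load_dataset

# Per-task details: each harness configuration exposes a "latest" split
# pointing at the most recent run.
details = load_dataset(
    "open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```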
## Latest results
These are the [latest results from run 2023-10-09T01:34:27.623935](https://huggingface.co/datasets/open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1/blob/main/results_2023-10-09T01-34-27.623935.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.708690489375333,
"acc_stderr": 0.030558651569776513,
"acc_norm": 0.712790648817897,
"acc_norm_stderr": 0.030527861535455017,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965838,
"mc2": 0.5240600477973963,
"mc2_stderr": 0.014676584418259612
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620197,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238361
},
"harness|hellaswag|10": {
"acc": 0.6767576180043816,
"acc_stderr": 0.004667585072717502,
"acc_norm": 0.8717386974706234,
"acc_norm_stderr": 0.003336971535131172
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.026199808807561918,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.026199808807561918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516477,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516477
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.021469735576055346,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.021469735576055346
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607548,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607548
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.022489389793654817,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.022489389793654817
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205146,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205146
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9045871559633027,
"acc_stderr": 0.012595899282335793,
"acc_norm": 0.9045871559633027,
"acc_norm_stderr": 0.012595899282335793
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6712962962962963,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.6712962962962963,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065494,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035216,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035216
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628124,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628124
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.01604626163167314,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.01604626163167314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216055,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216055
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.020776761102512965,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.020776761102512965
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4960893854748603,
"acc_stderr": 0.016721990073156657,
"acc_norm": 0.4960893854748603,
"acc_norm_stderr": 0.016721990073156657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.019242526226544543,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.019242526226544543
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5780141843971631,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.5780141843971631,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5730117340286832,
"acc_stderr": 0.012633353557534414,
"acc_norm": 0.5730117340286832,
"acc_norm_stderr": 0.012633353557534414
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.0242310133705411,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.0242310133705411
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.016992723465466215,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.016992723465466215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965838,
"mc2": 0.5240600477973963,
"mc2_stderr": 0.014676584418259612
}
}
```
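To pull these aggregated numbers programmatically, here is a minimal sketch (assuming the `datasets` library and the "results" configuration / "latest" split convention described above):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics;
# the "latest" split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1",
    "results",
    split="latest",
)
print(results[0])
```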
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HSL-MJ/llm60 | 2023-10-09T01:53:20.000Z | [
"license:apache-2.0",
"region:us"
] | HSL-MJ | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
adam89/TinyStoriesChinese | 2023-10-10T08:18:25.000Z | [
"license:cdla-sharing-1.0",
"region:us"
] | adam89 | null | null | null | 1 | 0 | ---
license: cdla-sharing-1.0
---
A Chinese translation of the [TinyStories](https://huggingface.co/datasets/roneneldan/TinyStories) dataset. Only the `story` field has been translated (the translated field is `story_zh`):
```json
{
"story": "\n\nLily and Ben are friends. They like to play in the park. One day, they see a big tree with a swing. Lily wants to try the swing. She runs to the tree and climbs on the swing.\n\"Push me, Ben!\" she says. Ben pushes her gently. Lily feels happy. She swings higher and higher. She laughs and shouts.\nBen watches Lily. He thinks she is cute. He wants to swing too. He waits for Lily to stop. But Lily does not stop. She swings faster and faster. She is having too much fun.\n\"Can I swing too, Lily?\" Ben asks. Lily does not hear him. She is too busy swinging. Ben feels sad. He walks away.\nLily swings so high that she loses her grip. She falls off the swing. She lands on the ground. She hurts her foot. She cries.\n\"Ow, ow, ow!\" she says. She looks for Ben. She wants him to help her. But Ben is not there. He is gone.\nLily feels sorry. She wishes she had shared the swing with Ben. She wishes he was there to hug her. She limps to the tree. She sees something hanging from a branch. It is Ben's hat. He left it for her.\nLily smiles. She thinks Ben is nice. She puts on his hat. She hopes he will come back. She wants to say sorry. She wants to be friends again.",
"instruction": {
"prompt:": "Write a short story (3-5 paragraphs) which only uses very simple words that a 3 year old child would understand. The story should use the verb \"hang\", the noun \"foot\" and the adjective \"cute\". The story has the following features: the story should contain at least one dialogue. Remember to only use simple words!\n\nPossible story:",
"words": [
"hang",
"foot",
"cute"
],
"features": [
"Dialogue"
]
},
"summary": "Lily and Ben play in the park and Lily gets too caught up in swinging, causing Ben to leave. Lily falls off the swing and hurts herself, but Ben leaves his hat for her as a kind gesture.",
"source": "GPT-4",
"story_zh": "莉莉和本是朋友。他们喜欢在公园里玩。有一天,他们在一棵大树下看到了一个秋千。莉莉想试试那个秋千。她跑到树下,爬上了秋千。\n\"推我,本!\"她说。本轻轻地推了她一下。莉莉感到很开心。她越荡越高,笑着喊叫。\n本看着莉莉。他觉得她很可爱。他也想荡秋千。他在莉莉停下来之后等着。但是莉莉没有停下来。她越荡越快。她玩得太高兴了。\n\"我也可以荡秋千吗,莉莉?\"本问。莉莉没听到他的话。她忙着荡秋千。本觉得很难过。他走开了。\n莉莉荡得太高,失去了平衡。她从秋千上摔下来,落在地上。她扭伤了脚。她哭了起来。\n\"哎呀,哎呀,哎呀!\"她说。她在找本。她希望他能帮助她。但本不在那里。他走了。\n莉莉感到很抱歉。她希望她能和本分享秋千。她希望他在那里拥抱她。她一瘸一拐地走到树下。她看到有什么东西挂在树枝上。那是本的帽子。他留给她的。\n莉莉笑了。她觉得本很好。她戴上了他的帽子。她希望他会回来。她想道歉。她想再次成为朋友。"
}
```
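A minimal loading sketch (the `train` split name is an assumption here, not something this card confirms):
```python
from datasets import load_dataset

# Load the translated dataset; the "train" split name is assumed.
ds = load_dataset("adam89/TinyStoriesChinese", split="train")

example = ds[0]
print(example["story"])     # original English story
print(example["story_zh"])  # Chinese translation
```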
A few side-by-side examples to show the translation quality:
```text
Lily and Ben are friends. They like to play in the park. One day, they see a big tree with a swing. Lily wants to try the swing. She runs to the tree and climbs on the swing.
"Push me, Ben!" she says. Ben pushes her gently. Lily feels happy. She swings higher and higher. She laughs and shouts.
Ben watches Lily. He thinks she is cute. He wants to swing too. He waits for Lily to stop. But Lily does not stop. She swings faster and faster. She is having too much fun.
"Can I swing too, Lily?" Ben asks. Lily does not hear him. She is too busy swinging. Ben feels sad. He walks away.
Lily swings so high that she loses her grip. She falls off the swing. She lands on the ground. She hurts her foot. She cries.
"Ow, ow, ow!" she says. She looks for Ben. She wants him to help her. But Ben is not there. He is gone.
Lily feels sorry. She wishes she had shared the swing with Ben. She wishes he was there to hug her. She limps to the tree. She sees something hanging from a branch. It is Ben's hat. He left it for her.
Lily smiles. She thinks Ben is nice. She puts on his hat. She hopes he will come back. She wants to say sorry. She wants to be friends again.
莉莉和本是朋友。他们喜欢在公园里玩。有一天,他们在一棵大树下看到了一个秋千。莉莉想试试那个秋千。她跑到树下,爬上了秋千。
"推我,本!"她说。本轻轻地推了她一下。莉莉感到很开心。她越荡越高,笑着喊叫。
本看着莉莉。他觉得她很可爱。他也想荡秋千。他在莉莉停下来之后等着。但是莉莉没有停下来。她越荡越快。她玩得太高兴了。
"我也可以荡秋千吗,莉莉?"本问。莉莉没听到他的话。她忙着荡秋千。本觉得很难过。他走开了。
莉莉荡得太高,失去了平衡。她从秋千上摔下来,落在地上。她扭伤了脚。她哭了起来。
"哎呀,哎呀,哎呀!"她说。她在找本。她希望他能帮助她。但本不在那里。他走了。
莉莉感到很抱歉。她希望她能和本分享秋千。她希望他在那里拥抱她。她一瘸一拐地走到树下。她看到有什么东西挂在树枝上。那是本的帽子。他留给她的。
莉莉笑了。她觉得本很好。她戴上了他的帽子。她希望他会回来。她想道歉。她想再次成为朋友。
```
```text
Once upon a time, there was a little girl named Lily. She had a teddy bear that she loved so much. One day, she lost it while playing in the park. She looked everywhere, but she couldn't find it. She felt sad and scared without her teddy bear.
Lily's mommy saw her crying and asked what was wrong. Lily told her that she lost her teddy bear. Mommy hugged her and said, "Don't worry, we'll search for it together." They went back to the park and looked everywhere. After a while, they found the teddy bear under a tree. Lily was so happy!
She hugged her teddy bear and felt comfortable again. She said, "I hope I never lose you again, teddy bear." Mommy smiled and said, "Me too, Lily. You and teddy bear are the best of friends." And they all went home, happy and content. The end.
从前,有一个小女孩叫莉莉。她非常喜欢她的泰迪熊。有一天,她在公园里玩时把它弄丢了。她找遍了所有地方,但仍然找不到它。没有她的泰迪熊,她感到很难过和害怕。
莉莉的妈妈看到她哭泣,问她发生了什么事。莉莉告诉她自己把泰迪熊弄丢了。妈妈抱住她说:“别担心,我们会一起去找的。”他们回到公园,到处寻找。过了一会儿,他们在树下找到了泰迪熊。莉莉非常高兴!
她拥抱了她的泰迪熊,感觉又舒服了。她说:“我希望我再也不要失去你,泰迪熊。”妈妈笑着说:“我也这么想,莉莉。你和泰迪熊是最好的朋友。”然后他们都高高兴兴地回家了,感到非常满足。结束。
```
```text
Once upon a time, there was a cute puppy named Max. Max was very adorable with his big, brown eyes and wagging tail. One day, Max's owner, Emily, told him that they needed to go to the post office to mail a letter. Max didn't know what that meant, but he was excited to go for a car ride.
At the post office, Emily gave the letter to the nice lady behind the desk. The lady asked Emily for a number and Emily gave her one. Max didn't know what a number was, but he saw the lady type something on the computer.
After they mailed the letter, Emily and Max went back to the car. Max was happy that they went on an adventure and he couldn't wait for the next one.
从前,有一只可爱的狗狗名叫Max。Max 非常可爱,大大的棕色眼睛和摇摆的尾巴都让人喜欢。有一天,Emily告诉Max他们需要去邮局寄一封信。Max并不知道那是什么意思,但他很兴奋能去兜风。
在邮局,Emily把信交给柜台后面友好的女士。女士问Emily要了一个号码,Emily给了她一个。Max并不知道什么是号码,但看到女士在电脑上输入了一些东西。
寄完信后,Emily和Max回到了车里。Max很高兴他们去了一趟冒险,他迫不及待地期待着下一次冒险。
```
```text
One day, a kind and honest cat named Tom found a pretty velvet ribbon. He wanted to hang it on his door. But when he tried to hang it, he saw his friend, a small bird named Sue, was sad.
Sue wanted the velvet ribbon too. Tom did not want to make Sue sad. So, he thought of a way to share the ribbon. He knew that they both liked to play games.
Tom said, "Let's play a game. We can take turns to have the ribbon. Today, I will hang it on my door. Tomorrow, you can hang it on your tree."
Sue liked this idea. They played and shared the velvet ribbon every day. Tom and Sue were both happy. They learned that sharing is a good way to solve problems and stay friends.
一天,一只名叫汤姆的善良诚实的猫发现了一条漂亮的天鹅绒彩带。他想把它挂在门上。但当他尝试挂上时,看到他的朋友,一只名叫苏的小鸟,很伤心。
苏也想要这条天鹅绒彩带。汤姆不想让苏伤心。所以,他想到了一个分享彩带的方法。他知道他们都喜欢玩游戏。
汤姆说:“我们来玩个游戏吧。我们可以轮流拥有这条彩带。今天,我把它挂在我门上。明天,你可以把它挂在你树上。”
苏喜欢这个主意。他们每天都会玩游戏并分享这根天鹅绒彩带。汤姆和苏都很开心。他们学会了分享是一种解决问题的良好方法,也是保持友谊的好方法。
``` |
open-llm-leaderboard/details_pankajmathur__model_007 | 2023-10-09T02:04:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of pankajmathur/model_007
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/model_007](https://huggingface.co/pankajmathur/model_007) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__model_007\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T02:03:09.335068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__model_007/blob/main/results_2023-10-09T02-03-09.335068.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6901502879968988,\n\
\ \"acc_stderr\": 0.031344534847114004,\n \"acc_norm\": 0.6939037892141556,\n\
\ \"acc_norm_stderr\": 0.03131458982120537,\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.01741294198611531,\n \"mc2\": 0.6312306236860621,\n\
\ \"mc2_stderr\": 0.014945471343395618\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6908982274447322,\n\
\ \"acc_stderr\": 0.004611787665905346,\n \"acc_norm\": 0.8765186217884884,\n\
\ \"acc_norm_stderr\": 0.003283165867631372\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745657,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745657\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"\
acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706467,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706467\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821678,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821678\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078894,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078894\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7436974789915967,\n \"acc_stderr\": 0.02835962087053395,\n \
\ \"acc_norm\": 0.7436974789915967,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.0134199390186812,\n \"acc_norm\"\
: 0.8899082568807339,\n \"acc_norm_stderr\": 0.0134199390186812\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658925,\n\
\ \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658925\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445815,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445815\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305724,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5497206703910614,\n\
\ \"acc_stderr\": 0.016639615236845817,\n \"acc_norm\": 0.5497206703910614,\n\
\ \"acc_norm_stderr\": 0.016639615236845817\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182651,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182651\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.02202136610022019,\n\
\ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.02202136610022019\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291477,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.545632333767927,\n\
\ \"acc_stderr\": 0.012716941720734818,\n \"acc_norm\": 0.545632333767927,\n\
\ \"acc_norm_stderr\": 0.012716941720734818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377197,\n\
\ \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377197\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n\
\ \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7918367346938775,\n\
\ \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.7918367346938775,\n\
\ \"acc_norm_stderr\": 0.025991117672813296\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n\
\ \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n\
\ \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n\
\ \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n\
\ \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n\
\ \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.01741294198611531,\n\
\ \"mc2\": 0.6312306236860621,\n \"mc2_stderr\": 0.014945471343395618\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/model_007
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-03-09.335068.parquet'
- config_name: results
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- results_2023-10-09T02-03-09.335068.parquet
- split: latest
path:
- results_2023-10-09T02-03-09.335068.parquet
---
# Dataset Card for Evaluation run of pankajmathur/model_007
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/model_007
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/model_007](https://huggingface.co/pankajmathur/model_007) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__model_007",
"harness_truthfulqa_mc_0",
split="train")
```
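If you instead want the aggregated metrics rather than per-task details, you can load the `results` configuration; a minimal sketch (the configuration and split names come from the YAML config list above):
```python
from datasets import load_dataset

# The "results" configuration aggregates all task metrics for a run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_pankajmathur__model_007",
	"results",
	split="latest")
print(results[0])  # one row holding the aggregated metrics for the run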
## Latest results
These are the [latest results from run 2023-10-09T02:03:09.335068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__model_007/blob/main/results_2023-10-09T02-03-09.335068.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6901502879968988,
"acc_stderr": 0.031344534847114004,
"acc_norm": 0.6939037892141556,
"acc_norm_stderr": 0.03131458982120537,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.6312306236860621,
"mc2_stderr": 0.014945471343395618
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.01368814730972912,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.6908982274447322,
"acc_stderr": 0.004611787665905346,
"acc_norm": 0.8765186217884884,
"acc_norm_stderr": 0.003283165867631372
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745657,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745657
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706467,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706467
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821678,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821678
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078894,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078894
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7436974789915967,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.7436974789915967,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.0134199390186812,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.0134199390186812
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658925,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658925
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445815,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445815
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305724,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5497206703910614,
"acc_stderr": 0.016639615236845817,
"acc_norm": 0.5497206703910614,
"acc_norm_stderr": 0.016639615236845817
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182651,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182651
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.02202136610022019,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.02202136610022019
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.545632333767927,
"acc_stderr": 0.012716941720734818,
"acc_norm": 0.545632333767927,
"acc_norm_stderr": 0.012716941720734818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377197,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377197
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.6312306236860621,
"mc2_stderr": 0.014945471343395618
}
}
```
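For readers who want these numbers without going through `datasets`, the results JSON linked above can be fetched directly; a minimal sketch, assuming `huggingface_hub` is installed (the filename is taken from the "Latest results" link, and the key layout is assumed to mirror the excerpt above):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_pankajmathur__model_007",
    filename="results_2023-10-09T02-03-09.335068.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The excerpt above shows the aggregated metrics under an "all" key; fall back
# gracefully in case the file nests them under a "results" key instead.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```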
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Mizukiluke/ureader-instruction-1.0 | 2023-10-09T09:51:17.000Z | [
"region:us"
] | Mizukiluke | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b | 2023-10-09T02:13:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of pankajmathur/orca_mini_v3_70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/orca_mini_v3_70b](https://huggingface.co/pankajmathur/orca_mini_v3_70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T02:12:05.216705](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b/blob/main/results_2023-10-09T02-12-05.216705.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7010508529623596,\n\
\ \"acc_stderr\": 0.0309286120388273,\n \"acc_norm\": 0.7049679984523141,\n\
\ \"acc_norm_stderr\": 0.030896356315399304,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n\
\ \"acc_stderr\": 0.00459390260197934,\n \"acc_norm\": 0.8785102569209321,\n\
\ \"acc_norm_stderr\": 0.0032602788112468337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708052,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8161290322580645,\n \"acc_stderr\": 0.02203721734026783,\n \"\
acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.02203721734026783\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"\
acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519513,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305733,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305733\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.0218552552634218,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.0218552552634218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n\
\ \"acc_stderr\": 0.01661139368726857,\n \"acc_norm\": 0.5575418994413408,\n\
\ \"acc_norm_stderr\": 0.01661139368726857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.021330868762127062,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.021330868762127062\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5851063829787234,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559973924380704,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.559973924380704,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427653,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427653\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/orca_mini_v3_70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-12-05.216705.parquet'
- config_name: results
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- results_2023-10-09T02-12-05.216705.parquet
- split: latest
path:
- results_2023-10-09T02-12-05.216705.parquet
---
# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/orca_mini_v3_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/orca_mini_v3_70b](https://huggingface.co/pankajmathur/orca_mini_v3_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b",
"harness_truthfulqa_mc_0",
split="train")
```
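The per-task configurations also expose a `latest` split alias, and the aggregated metrics live in the `results` configuration (both names are taken from the YAML front matter above). A minimal sketch of loading each with the same API:
```python
from datasets import load_dataset

# "latest" split alias of a per-task configuration
latest = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b",
                      "harness_truthfulqa_mc_0",
                      split="latest")

# aggregated metrics of the run, stored in the "results" configuration
results = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b",
                       "results",
                       split="latest")
```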
## Latest results
These are the [latest results from run 2023-10-09T02:12:05.216705](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b/blob/main/results_2023-10-09T02-12-05.216705.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7010508529623596,
"acc_stderr": 0.0309286120388273,
"acc_norm": 0.7049679984523141,
"acc_norm_stderr": 0.030896356315399304,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.013796182947785562,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.6951802429794861,
"acc_stderr": 0.00459390260197934,
"acc_norm": 0.8785102569209321,
"acc_norm_stderr": 0.0032602788112468337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708052,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026783,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026783
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519513,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305733,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305733
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5575418994413408,
"acc_stderr": 0.01661139368726857,
"acc_norm": 0.5575418994413408,
"acc_norm_stderr": 0.01661139368726857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.021330868762127062,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.021330868762127062
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.559973924380704,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.559973924380704,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427653,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427653
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
}
}
```
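To fetch the raw results file linked above directly, rather than going through `load_dataset`, a minimal sketch using `huggingface_hub` (the repo id and filename are taken from the link above; the key access assumes the file mirrors the dictionary shown):
```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results JSON for this run from the dataset repo
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b",
    filename="results_2023-10-09T02-12-05.216705.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# e.g. the overall accuracy, assuming the layout of the dictionary above
print(run["all"]["acc"])
```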
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
chunpingvi/dataset_tone3a | 2023-10-09T02:44:43.000Z | [
"region:us"
] | chunpingvi | null | null | null | 0 | 0 | Entry not found |
Brthy467/mixeddatasets | 2023-10-09T02:47:53.000Z | [
"region:us"
] | Brthy467 | null | null | null | 0 | 0 | Entry not found |
QEU/databricks-dolly-16k-line_ja-4_of_4 | 2023-10-09T03:14:00.000Z | [
"license:apache-2.0",
"region:us"
] | QEU | null | null | null | 0 | 0 | ---
license: apache-2.0
---
# This dataset is a Japanese version of databricks-dolly-15k, which became famous in 2023.
## Note, however, that the data is split into 4 parts.
## The content of the data has been changed considerably (about half no longer resembles the original):
- English glosses were added in parentheses after katakana loanwords.
- Records that were anomalous as Q&A pairs were found in this dataset and have been corrected.
- Entries with low informational value, such as trivia about "Game of Thrones", were removed.
- In addition, information was added as part of various other experiments.
For more details, please refer to [this blog post](https://jpnqeur23lmqsw.blogspot.com/2023/09/qeur23llmdss10-databricks15k.html).
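To load this fourth part, a minimal sketch using the standard `datasets` API (the default configuration is assumed):
```python
from datasets import load_dataset

# Part 4 of 4 of the Japanese version of the dataset
dataset = load_dataset("QEU/databricks-dolly-16k-line_ja-4_of_4")
print(dataset)
```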
|
nc33/task1 | 2023-10-09T03:18:30.000Z | [
"region:us"
] | nc33 | null | null | null | 0 | 0 | ---
dataset_info:
config_name: train
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1734562359
num_examples: 1236607
download_size: 288424748
dataset_size: 1734562359
configs:
- config_name: train
data_files:
- split: train
path: train/train-*
---
# Dataset Card for "task1"
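A minimal sketch of loading the split described in the YAML metadata above (the `train` config and split names, and the string `input`/`output` columns, are taken from the front matter):
```python
from datasets import load_dataset

# "task1" exposes a "train" configuration with a single "train" split,
# whose rows carry string "input" and "output" fields (per the YAML above)
ds = load_dataset("nc33/task1", "train", split="train")
print(ds[0]["input"])
print(ds[0]["output"])
```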
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jungtaekkim/datasets-nanophotonic-structures | 2023-10-10T04:26:33.000Z | [
"license:mit",
"region:us"
] | jungtaekkim | null | null | null | 0 | 0 | ---
license: mit
---
|
naphatmanu/index_modern_1 | 2023-10-09T03:37:03.000Z | [
"region:us"
] | naphatmanu | null | null | null | 0 | 0 | Entry not found |
naphatmanu/index-contemporary-1 | 2023-10-09T03:43:11.000Z | [
"license:mit",
"region:us"
] | naphatmanu | null | null | null | 0 | 0 | ---
license: mit
---
|
naphatmanu/index-modern-luxury-1 | 2023-10-09T03:46:23.000Z | [
"region:us"
] | naphatmanu | null | null | null | 0 | 0 | Entry not found |
Brthy467/mixedDataset2 | 2023-10-10T01:07:45.000Z | [
"region:us"
] | Brthy467 | null | null | null | 0 | 0 | Entry not found |
Serjesulegu/Mecanse | 2023-10-09T03:57:39.000Z | [
"license:apache-2.0",
"region:us"
] | Serjesulegu | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf | 2023-10-09T04:05:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elliotthwang/elliott_Llama-2-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elliotthwang/elliott_Llama-2-7b-hf](https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T04:04:19.372525](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf/blob/main/results_2023-10-09T04-04-19.372525.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47327947029225664,\n\
\ \"acc_stderr\": 0.03537938484698653,\n \"acc_norm\": 0.47722465501402456,\n\
\ \"acc_norm_stderr\": 0.03536531423897728,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4211238248723632,\n\
\ \"mc2_stderr\": 0.014032921437450484\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4948805460750853,\n \"acc_stderr\": 0.01461062489030916,\n\
\ \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866977\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5872336188010356,\n\
\ \"acc_stderr\": 0.004913253031155689,\n \"acc_norm\": 0.7833100975901215,\n\
\ \"acc_norm_stderr\": 0.004111475588052674\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525208,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5505050505050505,\n \"acc_stderr\": 0.035441324919479704,\n \"\
acc_norm\": 0.5505050505050505,\n \"acc_norm_stderr\": 0.035441324919479704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066465,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066465\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.636697247706422,\n \"acc_stderr\": 0.020620603919625804,\n \"\
acc_norm\": 0.636697247706422,\n \"acc_norm_stderr\": 0.020620603919625804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025445,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5588235294117647,\n \"acc_stderr\": 0.034849415144292316,\n \"\
acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.034849415144292316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6118143459915611,\n \"acc_stderr\": 0.03172295004332329,\n \
\ \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.03172295004332329\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.048979577377811674,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.048979577377811674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.03046365674734027,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.03046365674734027\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6347381864623244,\n\
\ \"acc_stderr\": 0.01721853002883864,\n \"acc_norm\": 0.6347381864623244,\n\
\ \"acc_norm_stderr\": 0.01721853002883864\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377927,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377927\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49691358024691357,\n \"acc_stderr\": 0.027820214158594384,\n\
\ \"acc_norm\": 0.49691358024691357,\n \"acc_norm_stderr\": 0.027820214158594384\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36571056062581486,\n\
\ \"acc_stderr\": 0.012301028188840567,\n \"acc_norm\": 0.36571056062581486,\n\
\ \"acc_norm_stderr\": 0.012301028188840567\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.03030625772246832,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.03030625772246832\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43790849673202614,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.43790849673202614,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.034375193373382504,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.034375193373382504\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4211238248723632,\n\
\ \"mc2_stderr\": 0.014032921437450484\n }\n}\n```"
repo_url: https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|arc:challenge|25_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hellaswag|10_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T04-04-19.372525.parquet'
- config_name: results
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- results_2023-10-09T04-04-19.372525.parquet
- split: latest
path:
- results_2023-10-09T04-04-19.372525.parquet
---
# Dataset Card for Evaluation run of elliotthwang/elliott_Llama-2-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elliotthwang/elliott_Llama-2-7b-hf](https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-09T04:04:19.372525](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf/blob/main/results_2023-10-09T04-04-19.372525.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47327947029225664,
"acc_stderr": 0.03537938484698653,
"acc_norm": 0.47722465501402456,
"acc_norm_stderr": 0.03536531423897728,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4211238248723632,
"mc2_stderr": 0.014032921437450484
},
"harness|arc:challenge|25": {
"acc": 0.4948805460750853,
"acc_stderr": 0.01461062489030916,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866977
},
"harness|hellaswag|10": {
"acc": 0.5872336188010356,
"acc_stderr": 0.004913253031155689,
"acc_norm": 0.7833100975901215,
"acc_norm_stderr": 0.004111475588052674
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525208,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5505050505050505,
"acc_stderr": 0.035441324919479704,
"acc_norm": 0.5505050505050505,
"acc_norm_stderr": 0.035441324919479704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066465,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066465
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.636697247706422,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.636697247706422,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.034849415144292316,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.034849415144292316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.03172295004332329,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.03172295004332329
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.048979577377811674,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.048979577377811674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.03046365674734027,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.03046365674734027
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6347381864623244,
"acc_stderr": 0.01721853002883864,
"acc_norm": 0.6347381864623244,
"acc_norm_stderr": 0.01721853002883864
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377927,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377927
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49691358024691357,
"acc_stderr": 0.027820214158594384,
"acc_norm": 0.49691358024691357,
"acc_norm_stderr": 0.027820214158594384
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36571056062581486,
"acc_stderr": 0.012301028188840567,
"acc_norm": 0.36571056062581486,
"acc_norm_stderr": 0.012301028188840567
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.03030625772246832,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.03030625772246832
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43790849673202614,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.43790849673202614,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.034375193373382504,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.034375193373382504
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4211238248723632,
"mc2_stderr": 0.014032921437450484
}
}
```
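The aggregated metrics shown above are also stored in the "results" configuration of this repo. A minimal loading sketch follows; the column layout of the results parquet file isn't documented in this card, so it only lists the columns rather than indexing into them:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# the "latest" split always points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf",
                       "results",
                       split="latest")
print(results.column_names)
```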
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
naphatmanu/index-modern-neo-classic-1 | 2023-10-09T04:08:42.000Z | [
"license:mit",
"region:us"
] | naphatmanu | null | null | null | 0 | 0 | ---
license: mit
---
|
yukuai0011/elec5307-project-2-dataset-full-public | 2023-10-09T16:20:58.000Z | [
"region:us"
] | yukuai0011 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Apple
'1': Avocado
'2': Banana
'3': Blueberry
'4': Coconut
'5': Cucumber
'6': Dragon_fruit
'7': Grape
'8': Grapefruit
'9': Kiwifruit
'10': Lemon
'11': Lychee
'12': Mangoes
'13': Orange
'14': Papaya
'15': Passion fruit
'16': Peach
'17': Pear
'18': Pineapple
'19': Pomegranate
'20': Raspberry
'21': Rockmelon
'22': Strawberries
'23': Tomato
'24': Waterlemon
splits:
- name: train
num_bytes: 344011868.018
num_examples: 3026
download_size: 319895933
dataset_size: 344011868.018
---
# Dataset Card for "elec5307-project-2-dataset-full-public"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naphatmanu/index-natural-scandinavian-1 | 2023-10-09T04:10:29.000Z | [
"license:mit",
"region:us"
] | naphatmanu | null | null | null | 0 | 0 | ---
license: mit
---
|
yukuai0011/elec5307-project-2-dataset-splited-public | 2023-10-09T16:47:41.000Z | [
"region:us"
] | yukuai0011 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Apple
'1': Avocado
'2': Banana
'3': Blueberry
'4': Coconut
'5': Cucumber
'6': Dragon_fruit
'7': Grape
'8': Grapefruit
'9': Kiwifruit
'10': Lemon
'11': Lychee
'12': Mangoes
'13': Orange
'14': Papaya
'15': Passion fruit
'16': Peach
'17': Pear
'18': Pineapple
'19': Pomegranate
'20': Raspberry
'21': Rockmelon
'22': Strawberries
'23': Tomato
'24': Waterlemon
splits:
- name: train
num_bytes: 270703771.307
num_examples: 2421
- name: test
num_bytes: 63336528.0
num_examples: 605
download_size: 320028339
dataset_size: 334040299.307
---
# Dataset Card for "elec5307-project-2-dataset-splited-public"
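This is the same image collection as the full-public variant above, pre-split into train and test. A minimal sketch of loading both splits:
```python
from datasets import load_dataset

# Pre-split variant: 2421 train / 605 test examples per the YAML header.
splits = load_dataset("yukuai0011/elec5307-project-2-dataset-splited-public")
train, test = splits["train"], splits["test"]
print(len(train), len(test))  # expected: 2421 605
```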
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MattReiley/first | 2023-10-09T04:19:43.000Z | [
"region:us"
] | MattReiley | null | null | null | 0 | 0 | Entry not found |
Andyrasika/potholes-dataset | 2023-10-09T04:43:06.000Z | [
"region:us"
] | Andyrasika | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 26575443.0
num_examples: 350
- name: validation
num_bytes: 2929769.0
num_examples: 34
- name: test
num_bytes: 1442112.0
num_examples: 16
download_size: 30638600
dataset_size: 30947324.0
---
# Dataset Card for "potholes-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nanyangtu/LLVisionQA-QBench | 2023-10-09T15:48:09.000Z | [
"license:cc-by-nc-sa-4.0",
"arxiv:2309.14181",
"region:us"
] | nanyangtu | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
---
Dataset for Paper: **Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision**.
*Images*: `images.tar`
`dev`-*labels*: `llvisionqa_dev.json`
`test`-*labels*: `llvisionqa_test.json`
See the GitHub repo for usage instructions: https://github.com/vqassessment/q-bench.
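A minimal sketch of fetching the `dev` labels listed above; the JSON schema isn't documented in this card, so the inspection below is deliberately generic:
```python
import json

from huggingface_hub import hf_hub_download

# Download llvisionqa_dev.json (named in this card) from the dataset repo.
path = hf_hub_download(repo_id="nanyangtu/LLVisionQA-QBench",
                       filename="llvisionqa_dev.json",
                       repo_type="dataset")
with open(path) as f:
    dev = json.load(f)
print(type(dev), len(dev))
```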
Feel free to cite us.
```bibtex
@article{wu2023qbench,
title={Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision},
author={Wu, Haoning and Zhang, Zicheng and Zhang, Erli and Chen, Chaofeng and Liao, Liang and Wang, Annan and Li, Chunyi and Sun, Wenxiu and Yan, Qiong and Zhai, Guangtao and Lin, Weisi},
year={2023},
eprint={2309.14181},
}
``` |
nanyangtu/LLDescribe-QBench | 2023-10-09T08:26:58.000Z | [
"license:cc-by-nc-sa-4.0",
"arxiv:2309.14181",
"region:us"
] | nanyangtu | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
---
Dataset for Paper: **Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision**.
See GitHub: https://github.com/vqassessment/q-bench.
Feel free to cite us.
```bibtex
@article{wu2023qbench,
title={Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision},
author={Wu, Haoning and Zhang, Zicheng and Zhang, Erli and Chen, Chaofeng and Liao, Liang and Wang, Annan and Li, Chunyi and Sun, Wenxiu and Yan, Qiong and Zhai, Guangtao and Lin, Weisi},
year={2023},
eprint={2309.14181},
}
``` |
chunpingvi/tinystories_raw | 2023-10-09T05:26:31.000Z | [
"region:us"
] | chunpingvi | null | null | null | 0 | 0 | Entry not found |
stanmalkinson199/CraigTuckerPTBR | 2023-10-09T05:45:13.000Z | [
"license:openrail",
"region:us"
] | stanmalkinson199 | null | null | null | 0 | 0 | ---
license: openrail
---
|
darcy01/customeDataSet | 2023-10-09T06:08:47.000Z | [
"region:us"
] | darcy01 | null | null | null | 0 | 0 | Entry not found |