id (string, 2–115 chars) | lastModified (string, 24 chars) | tags (list) | author (string, 2–42 chars, nullable) | description (string, 0–68.7k chars, nullable) | citation (string, 0–10.7k chars, nullable) | cardData (null) | likes (int64, 0–3.55k) | downloads (int64, 0–10.1M) | card (string, 0–1.01M chars)
|---|---|---|---|---|---|---|---|---|---|
open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat | 2023-10-03T11:23:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of YeungNLP/firefly-llama2-13b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-llama2-13b-chat](https://huggingface.co/YeungNLP/firefly-llama2-13b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T11:22:10.318112](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat/blob/main/results_2023-10-03T11-22-10.318112.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5266849015776944,\n\
\ \"acc_stderr\": 0.03489558648435164,\n \"acc_norm\": 0.5306982531715042,\n\
\ \"acc_norm_stderr\": 0.03488010272391423,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.016387976779647935,\n \"mc2\": 0.48175867456411614,\n\
\ \"mc2_stderr\": 0.015525607847672544\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985996,\n\
\ \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.01444569896852077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5904202350129456,\n\
\ \"acc_stderr\": 0.0049075121031283446,\n \"acc_norm\": 0.7794264090818562,\n\
\ \"acc_norm_stderr\": 0.004137860370785957\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788683,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788683\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307702,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307702\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.02766618207553965,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.02766618207553965\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.033959703819985726,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.033959703819985726\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315967,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244442,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244442\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.032210245080411516,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.032210245080411516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954932,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.03247390276569669,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.03247390276569669\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415196,\n \"\
acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415196\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.04587904741301812,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.04587904741301812\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.027601921381417586,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.027601921381417586\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.016246087069701404,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.016246087069701404\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.02629622791561367,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.02629622791561367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.016277927039638197,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.016277927039638197\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440126,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440126\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871588,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871588\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.02833801742861132,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.02833801742861132\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3956975228161669,\n\
\ \"acc_stderr\": 0.012489290735449014,\n \"acc_norm\": 0.3956975228161669,\n\
\ \"acc_norm_stderr\": 0.012489290735449014\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969768,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969768\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235936,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235936\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348642,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348642\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.016387976779647935,\n \"mc2\": 0.48175867456411614,\n\
\ \"mc2_stderr\": 0.015525607847672544\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-llama2-13b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-22-10.318112.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-22-10.318112.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-22-10.318112.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-22-10.318112.parquet'
- config_name: results
data_files:
- split: 2023_10_03T11_22_10.318112
path:
- results_2023-10-03T11-22-10.318112.parquet
- split: latest
path:
- results_2023-10-03T11-22-10.318112.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-13b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-llama2-13b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-13b-chat](https://huggingface.co/YeungNLP/firefly-llama2-13b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat",
"harness_truthfulqa_mc_0",
split="train")
```
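As a sketch of the same pattern, the aggregated metrics can be loaded through the "results" configuration and the "latest" split (both names are taken from the configs listed in the YAML above):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run;
# the "latest" split always aliases the most recent timestamped run
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat",
    "results",
    split="latest",
)
print(results[0])
```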
## Latest results
These are the [latest results from run 2023-10-03T11:22:10.318112](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat/blob/main/results_2023-10-03T11-22-10.318112.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5266849015776944,
"acc_stderr": 0.03489558648435164,
"acc_norm": 0.5306982531715042,
"acc_norm_stderr": 0.03488010272391423,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.016387976779647935,
"mc2": 0.48175867456411614,
"mc2_stderr": 0.015525607847672544
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985996,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.01444569896852077
},
"harness|hellaswag|10": {
"acc": 0.5904202350129456,
"acc_stderr": 0.0049075121031283446,
"acc_norm": 0.7794264090818562,
"acc_norm_stderr": 0.004137860370785957
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788683,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788683
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307702,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307702
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.02766618207553965,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.02766618207553965
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.036974422050315967,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.036974422050315967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.032210245080411516,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.032210245080411516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.025334667080954932,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.025334667080954932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415196,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415196
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.04587904741301812,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.04587904741301812
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.027601921381417586,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.027601921381417586
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.016246087069701404,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.016246087069701404
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.02629622791561367,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.02629622791561367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638197,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638197
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440126,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440126
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871588,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.02833801742861132,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.02833801742861132
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3956975228161669,
"acc_stderr": 0.012489290735449014,
"acc_norm": 0.3956975228161669,
"acc_norm_stderr": 0.012489290735449014
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969768,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235936,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348642,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348642
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.016387976779647935,
"mc2": 0.48175867456411614,
"mc2_stderr": 0.015525607847672544
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-a11d2fa3-7db1-4a85-baf0-6a017ab7e5da | 2023-10-03T11:29:52.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
osmanh/llama2 | 2023-10-03T11:34:12.000Z | [
"region:us"
] | osmanh | null | null | null | 0 | 0 | Entry not found |
umitmertcakmak/mental_health_chatbot_dataset | 2023-10-03T11:34:32.000Z | [
"region:us"
] | umitmertcakmak | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 189421
num_examples: 172
download_size: 102272
dataset_size: 189421
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mental_health_chatbot_dataset"
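A minimal loading sketch (the single `train` split and the `text` field are declared in the YAML config above):

```python
from datasets import load_dataset

# Load the train split declared in the YAML config above.
ds = load_dataset("umitmertcakmak/mental_health_chatbot_dataset", split="train")
print(ds[0]["text"])  # each example exposes a single "text" field
```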
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_PocketDoc__Dans-MysteryModel-13b | 2023-10-03T11:40:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PocketDoc/Dans-MysteryModel-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-MysteryModel-13b](https://huggingface.co/PocketDoc/Dans-MysteryModel-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-MysteryModel-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T11:39:23.450846](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-MysteryModel-13b/blob/main/results_2023-10-03T11-39-23.450846.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5219975723114464,\n\
\ \"acc_stderr\": 0.03489138000510383,\n \"acc_norm\": 0.5262045630993822,\n\
\ \"acc_norm_stderr\": 0.03487360003272065,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.450035556324201,\n\
\ \"mc2_stderr\": 0.01480761962073364\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298962,\n\
\ \"acc_norm\": 0.5699658703071673,\n \"acc_norm_stderr\": 0.014467631559137986\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.599681338378809,\n\
\ \"acc_stderr\": 0.0048896154131441915,\n \"acc_norm\": 0.8035251941844254,\n\
\ \"acc_norm_stderr\": 0.003965196368697836\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854498,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854498\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972595,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n\
\ \"acc_stderr\": 0.027621717832907032,\n \"acc_norm\": 0.6193548387096774,\n\
\ \"acc_norm_stderr\": 0.027621717832907032\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165636,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165636\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756775,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756775\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.03161877917935413,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.03161877917935413\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.032478490123081544,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.032478490123081544\n },\n \"harness|hendrycksTest-high_school_physics|5\"\
: {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n\
\ \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.020192682985423333,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.020192682985423333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.03338473403207401,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.03338473403207401\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847008,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847008\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.03847021420456023,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.03847021420456023\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004274,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004274\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6909323116219668,\n\
\ \"acc_stderr\": 0.016524988919702215,\n \"acc_norm\": 0.6909323116219668,\n\
\ \"acc_norm_stderr\": 0.016524988919702215\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.02629622791561367,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.02629622791561367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.028099240775809553,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.028099240775809553\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.027460099557005135,\n\
\ \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.027460099557005135\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557309,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557309\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4002607561929596,\n\
\ \"acc_stderr\": 0.012513582529136216,\n \"acc_norm\": 0.4002607561929596,\n\
\ \"acc_norm_stderr\": 0.012513582529136216\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5016339869281046,\n \"acc_stderr\": 0.020227726838150117,\n \
\ \"acc_norm\": 0.5016339869281046,\n \"acc_norm_stderr\": 0.020227726838150117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.450035556324201,\n\
\ \"mc2_stderr\": 0.01480761962073364\n }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-MysteryModel-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-39-23.450846.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-39-23.450846.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-39-23.450846.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-39-23.450846.parquet'
- config_name: results
data_files:
- split: 2023_10_03T11_39_23.450846
path:
- results_2023-10-03T11-39-23.450846.parquet
- split: latest
path:
- results_2023-10-03T11-39-23.450846.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-MysteryModel-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-MysteryModel-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-MysteryModel-13b](https://huggingface.co/PocketDoc/Dans-MysteryModel-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-MysteryModel-13b",
"harness_truthfulqa_mc_0",
split="train")
```
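The aggregated metrics are also exposed as their own configuration; a minimal sketch using the `results` config and its `latest` split, both declared in the YAML above:

```python
from datasets import load_dataset

# The "results" config aggregates all task metrics for this run;
# the "latest" split always resolves to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-MysteryModel-13b",
                       "results",
                       split="latest")
```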
## Latest results
These are the [latest results from run 2023-10-03T11:39:23.450846](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-MysteryModel-13b/blob/main/results_2023-10-03T11-39-23.450846.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5219975723114464,
"acc_stderr": 0.03489138000510383,
"acc_norm": 0.5262045630993822,
"acc_norm_stderr": 0.03487360003272065,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.450035556324201,
"mc2_stderr": 0.01480761962073364
},
"harness|arc:challenge|25": {
"acc": 0.5255972696245734,
"acc_stderr": 0.014592230885298962,
"acc_norm": 0.5699658703071673,
"acc_norm_stderr": 0.014467631559137986
},
"harness|hellaswag|10": {
"acc": 0.599681338378809,
"acc_stderr": 0.0048896154131441915,
"acc_norm": 0.8035251941844254,
"acc_norm_stderr": 0.003965196368697836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854498,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854498
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972595,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.027621717832907032,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.027621717832907032
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165636,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165636
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.03161877917935413,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.03161877917935413
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.032478490123081544,
"acc_norm": 0.5,
"acc_norm_stderr": 0.032478490123081544
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.020192682985423333,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.020192682985423333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.030165137867847008,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.030165137867847008
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.03847021420456023,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.03847021420456023
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004274,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004274
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6909323116219668,
"acc_stderr": 0.016524988919702215,
"acc_norm": 0.6909323116219668,
"acc_norm_stderr": 0.016524988919702215
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.02629622791561367,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.02629622791561367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.028099240775809553,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.028099240775809553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.027460099557005135,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.027460099557005135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557309,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557309
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4002607561929596,
"acc_stderr": 0.012513582529136216,
"acc_norm": 0.4002607561929596,
"acc_norm_stderr": 0.012513582529136216
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.48161764705882354,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.48161764705882354,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5016339869281046,
"acc_stderr": 0.020227726838150117,
"acc_norm": 0.5016339869281046,
"acc_norm_stderr": 0.020227726838150117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.450035556324201,
"mc2_stderr": 0.01480761962073364
}
}
```
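For quick inspection, the per-task entries in the aggregated dictionary above can be filtered with plain Python. A minimal sketch, assuming `results` holds the parsed JSON shown above (only two real entries are inlined here so the snippet runs standalone):
```python
# Minimal sketch: rank the MMLU ("hendrycksTest") subtasks by accuracy.
# `results` is assumed to be the parsed aggregated JSON shown above;
# two entries from it are inlined for illustration.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7134502923976608},
    "harness|hendrycksTest-virology|5": {"acc": 0.4036144578313253},
}

mmlu = {
    name.split("hendrycksTest-")[1].split("|")[0]: vals["acc"]
    for name, vals in results.items()
    if "hendrycksTest-" in name
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.3f}")
```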
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_migtissera__Synthia-13B-v1.2 | 2023-10-03T11:43:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-13B-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-13B-v1.2](https://huggingface.co/migtissera/Synthia-13B-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-13B-v1.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T11:41:50.925709](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-13B-v1.2/blob/main/results_2023-10-03T11-41-50.925709.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5660497955475398,\n\
\ \"acc_stderr\": 0.03436189503917685,\n \"acc_norm\": 0.5700429681112433,\n\
\ \"acc_norm_stderr\": 0.03434042213083196,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.4727191424035362,\n\
\ \"mc2_stderr\": 0.015128415623267133\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848027,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n\
\ \"acc_stderr\": 0.004819367172685959,\n \"acc_norm\": 0.8293168691495718,\n\
\ \"acc_norm_stderr\": 0.0037546293132751604\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.03508370520442666,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.03508370520442666\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164545,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164545\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7596330275229358,\n\
\ \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.7596330275229358,\n\
\ \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n\
\ \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.015016884698539871,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.015016884698539871\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977243,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4670391061452514,\n\
\ \"acc_stderr\": 0.016686126653013937,\n \"acc_norm\": 0.4670391061452514,\n\
\ \"acc_norm_stderr\": 0.016686126653013937\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.02736807824397164,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.02736807824397164\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722334,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n\
\ \"acc_stderr\": 0.012579699631289264,\n \"acc_norm\": 0.41395045632333766,\n\
\ \"acc_norm_stderr\": 0.012579699631289264\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159644,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159644\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.4727191424035362,\n\
\ \"mc2_stderr\": 0.015128415623267133\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-13B-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-41-50.925709.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-41-50.925709.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-41-50.925709.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-41-50.925709.parquet'
- config_name: results
data_files:
- split: 2023_10_03T11_41_50.925709
path:
- results_2023-10-03T11-41-50.925709.parquet
- split: latest
path:
- results_2023-10-03T11-41-50.925709.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-13B-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-13B-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-13B-v1.2](https://huggingface.co/migtissera/Synthia-13B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-13B-v1.2",
"harness_truthfulqa_mc_0",
split="train")
```
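The aggregated metrics can be loaded the same way from the "results" configuration; a minimal sketch, assuming the "latest" split naming defined in this card's configs:
```python
from datasets import load_dataset

# Load the aggregated metrics for this evaluation run; "latest" points
# to the most recent timestamped results (a sketch, assuming the
# "results" configuration and "latest" split listed in this card).
results = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-13B-v1.2",
	"results",
	split="latest")
```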
## Latest results
These are the [latest results from run 2023-10-03T11:41:50.925709](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-13B-v1.2/blob/main/results_2023-10-03T11-41-50.925709.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5660497955475398,
"acc_stderr": 0.03436189503917685,
"acc_norm": 0.5700429681112433,
"acc_norm_stderr": 0.03434042213083196,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.01654241280949489,
"mc2": 0.4727191424035362,
"mc2_stderr": 0.015128415623267133
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848027,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685959,
"acc_norm": 0.8293168691495718,
"acc_norm_stderr": 0.0037546293132751604
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.03508370520442666,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.03508370520442666
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164545,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.015016884698539871,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.015016884698539871
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977243,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4670391061452514,
"acc_stderr": 0.016686126653013937,
"acc_norm": 0.4670391061452514,
"acc_norm_stderr": 0.016686126653013937
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.02736807824397164,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.02736807824397164
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722334,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289264,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289264
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159644,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159644
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.01654241280949489,
"mc2": 0.4727191424035362,
"mc2_stderr": 0.015128415623267133
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_boomerchan__magpie-13b | 2023-10-03T11:50:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of boomerchan/magpie-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [boomerchan/magpie-13b](https://huggingface.co/boomerchan/magpie-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_boomerchan__magpie-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T11:48:49.581129](https://huggingface.co/datasets/open-llm-leaderboard/details_boomerchan__magpie-13b/blob/main/results_2023-10-03T11-48-49.581129.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5827073934681583,\n\
\ \"acc_stderr\": 0.034048991446061445,\n \"acc_norm\": 0.5867699973335664,\n\
\ \"acc_norm_stderr\": 0.03402506673865785,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.49146975171261703,\n\
\ \"mc2_stderr\": 0.015182175866066504\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5955631399317406,\n \"acc_stderr\": 0.014342036483436175,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6403106950806612,\n\
\ \"acc_stderr\": 0.004789284723955857,\n \"acc_norm\": 0.8424616610237005,\n\
\ \"acc_norm_stderr\": 0.0036356303524759065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552746,\n \"\
acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552746\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n \"\
acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.0316314580755238,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.0316314580755238\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703642,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703642\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937148,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937148\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654666,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654666\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4670391061452514,\n\
\ \"acc_stderr\": 0.016686126653013934,\n \"acc_norm\": 0.4670391061452514,\n\
\ \"acc_norm_stderr\": 0.016686126653013934\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.027420477662629235,\n\
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.027420477662629235\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.019886221037501862,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.019886221037501862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.49146975171261703,\n\
\ \"mc2_stderr\": 0.015182175866066504\n }\n}\n```"
repo_url: https://huggingface.co/boomerchan/magpie-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-48-49.581129.parquet'
- config_name: results
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- results_2023-10-03T11-48-49.581129.parquet
- split: latest
path:
- results_2023-10-03T11-48-49.581129.parquet
---
# Dataset Card for Evaluation run of boomerchan/magpie-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/boomerchan/magpie-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [boomerchan/magpie-13b](https://huggingface.co/boomerchan/magpie-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_boomerchan__magpie-13b",
"harness_truthfulqa_mc_0",
split="train")
```
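The same call works for the other configurations declared in this card's metadata. As a minimal sketch (using only config and split names that appear in the metadata above, not a definitive API beyond `load_dataset` itself), you could load the aggregated metrics or the most recent run of a single task:
```python
from datasets import load_dataset

# Aggregated metrics for the run: the "results" config, "latest" split.
results = load_dataset("open-llm-leaderboard/details_boomerchan__magpie-13b",
                       "results",
                       split="latest")

# Per-task details at the most recent timestamp, e.g. the 5-shot
# world religions subset (config names are listed in the metadata above).
world_religions = load_dataset("open-llm-leaderboard/details_boomerchan__magpie-13b",
                               "harness_hendrycksTest_world_religions_5",
                               split="latest")
```
Here "latest" is an alias for the newest timestamped split, so these calls return the same data as the corresponding `2023_10_03T11_48_49.581129` split.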
## Latest results
These are the [latest results from run 2023-10-03T11:48:49.581129](https://huggingface.co/datasets/open-llm-leaderboard/details_boomerchan__magpie-13b/blob/main/results_2023-10-03T11-48-49.581129.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5827073934681583,
"acc_stderr": 0.034048991446061445,
"acc_norm": 0.5867699973335664,
"acc_norm_stderr": 0.03402506673865785,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.49146975171261703,
"mc2_stderr": 0.015182175866066504
},
"harness|arc:challenge|25": {
"acc": 0.5955631399317406,
"acc_stderr": 0.014342036483436175,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.6403106950806612,
"acc_stderr": 0.004789284723955857,
"acc_norm": 0.8424616610237005,
"acc_norm_stderr": 0.0036356303524759065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.037657466938651504,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.037657466938651504
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425072,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552746,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.0316314580755238,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.0316314580755238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703642,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703642
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937148,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937148
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654666,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654666
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4670391061452514,
"acc_stderr": 0.016686126653013934,
"acc_norm": 0.4670391061452514,
"acc_norm_stderr": 0.016686126653013934
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.027420477662629235,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.027420477662629235
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.019886221037501862,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.019886221037501862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.49146975171261703,
"mc2_stderr": 0.015182175866066504
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
winterForestStump/10k_edgar_sec_filings | 2023-10-03T11:53:13.000Z | [
"region:us"
] | winterForestStump | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-f8c13b94-be88-499b-946b-65c047f82d91 | 2023-10-03T12:03:09.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
Sandeep81/llama2formated | 2023-10-03T12:19:54.000Z | [
"region:us"
] | Sandeep81 | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-83f941ac-2317-4349-955a-4cb3c2c162c8 | 2023-10-03T12:20:02.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
Mxode/CSDN-Community-C-Language-3years | 2023-10-03T12:36:23.000Z | [
"task_categories:question-answering",
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:zh",
"license:lgpl",
"code",
"region:us"
] | Mxode | null | null | null | 0 | 0 | ---
license: lgpl
task_categories:
- question-answering
- conversational
- text-generation
language:
- zh
tags:
- code
size_categories:
- 1K<n<10K
---
Q&A data from the CSDN C-language community covering **2020.10.2 ~ 2023.10.2**; images are not included, only text content.
**2380** entries in total. The data has undergone **initial cleaning and anonymization**, removing all posts with zero replies and all bot replies. To accommodate different use cases, the data is organized following the forum's threaded-reply structure; a sample (expanded) looks like this:
```json
{
"question": "刚学C语言,为什么这个代码运行不了呢",
"poster": "user-0",
"comments": [
{
"cid": "2",
"user": "user-2",
"content": "intunsigned intlong longunsigned long long统统容纳不下29的阶乘,早就溢出了。",
"referer": "user-0"
},
{
"cid": "3",
"user": "user-3",
"content": "#include <stdio.h> #include <math.h> int main(void) { int i = 1; long long sum = 1; // 使用 long long 类型来存储阶乘结果 int x; printf(\"请输入一个非负整数: \"); if (scanf(\"%d\", &x) != 1 || x < 0) { printf(\"输入无效,请输入一个非负整数。\\n\"); return 1; // 返回错误码 } while (i <= x) { sum *= i; i++; } printf(\"%d 的阶乘是 %lld\\n\", x, sum); return 0; }",
"referer": "user-0"
}
]
}
```
`user` and `referer` are anonymized through a consistent mapping, but the reply relationships are preserved (i.e., both replies to the original poster and nested replies within a thread are kept intact).
`question` and `comment` are both stored as single lines, so no extra processing is needed.
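For illustration, a thread can be rebuilt from the `user`/`referer` fields alone. Below is a minimal sketch; the `data.jsonl` file name, the JSON-lines layout, and the truncation threshold are assumptions, not the repo's documented format:
```python
import json

# NOTE: the file name and JSON-lines layout are assumptions -- adjust to
# however the data is actually shipped in this repo.
DATA_PATH = "data.jsonl"
MAX_COMMENT_CHARS = 2000  # roughly the 80th percentile from the table below

def load_posts(path):
    """Yield one post dict per line, following the schema shown above."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            yield json.loads(line)

def format_thread(post, max_comment_chars=None):
    """Flatten a post and its comments into 'user -> referer: text' lines,
    optionally truncating overly long comments."""
    lines = [f"{post['poster']}: {post['question']}"]
    for c in post["comments"]:
        content = c["content"]
        if max_comment_chars is not None:
            content = content[:max_comment_chars]
        lines.append(f"{c['user']} -> {c['referer']}: {content}")
    return "\n".join(lines)

for post in load_posts(DATA_PATH):
    print(format_thread(post, MAX_COMMENT_CHARS))
    break  # preview only the first thread
```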
Since some answers are long and long-form text may be needed, the data has not been trimmed. The exact length percentiles are listed below; trim as needed:
```
question comments
count 2380.000000 2380.000000
mean 22.074370 1528.050840
std 14.986499 2608.022392
min 4.000000 69.000000
10% 7.900000 160.900000
20% 12.000000 235.800000
30% 14.000000 342.000000
40% 16.000000 469.000000
50% 18.000000 648.500000
60% 21.000000 889.000000
70% 25.000000 1234.300000
75% 27.000000 1542.500000
80% 30.000000 1990.400000
85% 34.000000 2665.800000
90% 40.000000 3810.800000
95% 51.000000 6008.050000
max 130.000000 30606.000000
``` |
atom-in-the-universe/bild-fa8cc18e-138d-4f86-9724-53d2381e5a8a | 2023-10-03T12:31:31.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-6c07d29f-478a-4cc0-95e0-591ee30716f8 | 2023-10-03T12:40:33.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-f46b7c73-1733-4944-ad8b-434c91626757 | 2023-10-03T12:45:22.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
ai202388/k_sd | 2023-10-10T10:44:08.000Z | [
"region:us"
] | ai202388 | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-8a55a885-6632-4ffe-badd-a3812a6bffb6 | 2023-10-03T12:50:43.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B | 2023-10-03T13:02:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/MLewd-ReMM-L2-Chat-20B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MLewd-ReMM-L2-Chat-20B](https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T13:01:09.823619](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B/blob/main/results_2023-10-03T13-01-09.823619.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5927393333620284,\n\
\ \"acc_stderr\": 0.03399197195287319,\n \"acc_norm\": 0.5963879963667763,\n\
\ \"acc_norm_stderr\": 0.03396857090690247,\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5562951743828177,\n\
\ \"mc2_stderr\": 0.015862974807699288\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268447,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6690898227444733,\n\
\ \"acc_stderr\": 0.004695791340502876,\n \"acc_norm\": 0.8562039434375622,\n\
\ \"acc_norm_stderr\": 0.0035016571073867085\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244219,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244219\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296563,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296563\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397436,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159267,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159267\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n\
\ \"acc_stderr\": 0.015302380123542115,\n \"acc_norm\": 0.7586206896551724,\n\
\ \"acc_norm_stderr\": 0.015302380123542115\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879706,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879706\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5083798882681564,\n\
\ \"acc_stderr\": 0.016720152794672486,\n \"acc_norm\": 0.5083798882681564,\n\
\ \"acc_norm_stderr\": 0.016720152794672486\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.027420477662629242,\n\
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.027420477662629242\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409818,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409818\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642443,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642443\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532074,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532074\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.029812630701569743,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.029812630701569743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5562951743828177,\n\
\ \"mc2_stderr\": 0.015862974807699288\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|arc:challenge|25_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hellaswag|10_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-01-09.823619.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-01-09.823619.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T13-01-09.823619.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T13-01-09.823619.parquet'
- config_name: results
data_files:
- split: 2023_10_03T13_01_09.823619
path:
- results_2023-10-03T13-01-09.823619.parquet
- split: latest
path:
- results_2023-10-03T13-01-09.823619.parquet
---
# Dataset Card for Evaluation run of Undi95/MLewd-ReMM-L2-Chat-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MLewd-ReMM-L2-Chat-20B](https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B",
"harness_truthfulqa_mc_0",
split="train")
```
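The aggregated metrics live in the "results" configuration (see the configs listed above); for example, its "latest" split can be loaded the same way:
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B",
	"results",
	split="latest")
```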
## Latest results
These are the [latest results from run 2023-10-03T13:01:09.823619](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B/blob/main/results_2023-10-03T13-01-09.823619.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5927393333620284,
"acc_stderr": 0.03399197195287319,
"acc_norm": 0.5963879963667763,
"acc_norm_stderr": 0.03396857090690247,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5562951743828177,
"mc2_stderr": 0.015862974807699288
},
"harness|arc:challenge|25": {
"acc": 0.5964163822525598,
"acc_stderr": 0.014337158914268447,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6690898227444733,
"acc_stderr": 0.004695791340502876,
"acc_norm": 0.8562039434375622,
"acc_norm_stderr": 0.0035016571073867085
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244219,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244219
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296563,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296563
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397436,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159267,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159267
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.015302380123542115,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.015302380123542115
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879706,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5083798882681564,
"acc_stderr": 0.016720152794672486,
"acc_norm": 0.5083798882681564,
"acc_norm_stderr": 0.016720152794672486
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.027420477662629242,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.027420477662629242
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642443,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642443
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532074,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.029812630701569743,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.029812630701569743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5562951743828177,
"mc2_stderr": 0.015862974807699288
}
}
```
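The aggregated numbers above are also stored in the "results" configuration, so they can be pulled programmatically; a minimal sketch (how the metrics are nested inside the row is an assumption and may vary between harness versions):
```python
from datasets import load_dataset

# the "results" config holds the aggregated metrics of each run;
# the "latest" split points at the most recent results parquet
results = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B",
	"results",
	split="latest")
print(results[0])  # one row per run; inspect it to see the metric layout
```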
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yavasde/lemmatized-wikitext2 | 2023-10-03T13:16:22.000Z | [
"region:us"
] | yavasde | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2652445
num_examples: 23767
- name: test
num_bytes: 313242
num_examples: 2891
- name: valid
num_bytes: 284363
num_examples: 2461
download_size: 1949711
dataset_size: 3250050
---
# Dataset Card for "lemmatized-wikitext2"
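A minimal loading sketch, assuming the standard `datasets` API and the splits declared in the metadata above (train: 23767, test: 2891, valid: 2461 examples):
```python
from datasets import load_dataset

# each example carries a single string field "text" (see dataset_info above)
ds = load_dataset("yavasde/lemmatized-wikitext2")
print(ds["train"][0]["text"])
```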
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b | 2023-10-03T13:23:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CobraMamba/mamba-gpt-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CobraMamba/mamba-gpt-7b](https://huggingface.co/CobraMamba/mamba-gpt-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T13:22:21.722990](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b/blob/main/results_2023-10-03T13-22-21.722990.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.476038892057817,\n\
\ \"acc_stderr\": 0.03525775152086002,\n \"acc_norm\": 0.4800761410842263,\n\
\ \"acc_norm_stderr\": 0.03524704421851352,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.42059760728665,\n\
\ \"mc2_stderr\": 0.014423576804670638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866977,\n\
\ \"acc_norm\": 0.5119453924914675,\n \"acc_norm_stderr\": 0.014607220340597167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5593507269468233,\n\
\ \"acc_stderr\": 0.004954503606471607,\n \"acc_norm\": 0.7540330611431986,\n\
\ \"acc_norm_stderr\": 0.004297788888297731\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.030723535249006107,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.030723535249006107\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.03765746693865151,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.03765746693865151\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3574468085106383,\n \"acc_stderr\": 0.03132941789476425,\n\
\ \"acc_norm\": 0.3574468085106383,\n \"acc_norm_stderr\": 0.03132941789476425\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194978,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194978\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5258064516129032,\n \"acc_stderr\": 0.02840609505765332,\n \"\
acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.02840609505765332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6110091743119266,\n \"acc_stderr\": 0.020902300887392873,\n \"\
acc_norm\": 0.6110091743119266,\n \"acc_norm_stderr\": 0.020902300887392873\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6176470588235294,\n \"acc_stderr\": 0.0341078533890472,\n \"acc_norm\"\
: 0.6176470588235294,\n \"acc_norm_stderr\": 0.0341078533890472\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \"\
acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004264,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6679438058748404,\n\
\ \"acc_stderr\": 0.016841174655295724,\n \"acc_norm\": 0.6679438058748404,\n\
\ \"acc_norm_stderr\": 0.016841174655295724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282542,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282542\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808847,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808847\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.565359477124183,\n \"acc_stderr\": 0.028384256704883034,\n\
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.028384256704883034\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n\
\ \"acc_stderr\": 0.028320325830105915,\n \"acc_norm\": 0.5369774919614148,\n\
\ \"acc_norm_stderr\": 0.028320325830105915\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.02779476010500873,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.02779476010500873\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347237,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347237\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3683181225554107,\n\
\ \"acc_stderr\": 0.012319403369564642,\n \"acc_norm\": 0.3683181225554107,\n\
\ \"acc_norm_stderr\": 0.012319403369564642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.02972215209928007,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.02972215209928007\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261447,\n \
\ \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261447\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935892,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935892\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\
\ \"acc_stderr\": 0.035281314729336065,\n \"acc_norm\": 0.5323383084577115,\n\
\ \"acc_norm_stderr\": 0.035281314729336065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.42059760728665,\n\
\ \"mc2_stderr\": 0.014423576804670638\n }\n}\n```"
repo_url: https://huggingface.co/CobraMamba/mamba-gpt-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|arc:challenge|25_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hellaswag|10_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-22-21.722990.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T13-22-21.722990.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T13-22-21.722990.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T13-22-21.722990.parquet'
- config_name: results
data_files:
- split: 2023_10_03T13_22_21.722990
path:
- results_2023-10-03T13-22-21.722990.parquet
- split: latest
path:
- results_2023-10-03T13-22-21.722990.parquet
---
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CobraMamba/mamba-gpt-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-7b](https://huggingface.co/CobraMamba/mamba-gpt-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from a single run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b",
"harness_truthfulqa_mc_0",
split="train")
```
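To pull the aggregated scores instead of per-sample details, you can point at the `results` configuration defined above (a minimal sketch; the exact column layout of the results parquet may vary between harness versions):
```python
from datasets import load_dataset

# The "latest" split always resolves to the most recent results parquet
# listed in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated metrics for this run
```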
## Latest results
These are the [latest results from run 2023-10-03T13:22:21.722990](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b/blob/main/results_2023-10-03T13-22-21.722990.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the `results` config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.476038892057817,
"acc_stderr": 0.03525775152086002,
"acc_norm": 0.4800761410842263,
"acc_norm_stderr": 0.03524704421851352,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.42059760728665,
"mc2_stderr": 0.014423576804670638
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.014582236460866977,
"acc_norm": 0.5119453924914675,
"acc_norm_stderr": 0.014607220340597167
},
"harness|hellaswag|10": {
"acc": 0.5593507269468233,
"acc_stderr": 0.004954503606471607,
"acc_norm": 0.7540330611431986,
"acc_norm_stderr": 0.004297788888297731
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.030723535249006107,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.030723535249006107
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.03765746693865151,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.03765746693865151
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3574468085106383,
"acc_stderr": 0.03132941789476425,
"acc_norm": 0.3574468085106383,
"acc_norm_stderr": 0.03132941789476425
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194978,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194978
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6110091743119266,
"acc_stderr": 0.020902300887392873,
"acc_norm": 0.6110091743119266,
"acc_norm_stderr": 0.020902300887392873
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.0341078533890472,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.0341078533890472
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004264,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6679438058748404,
"acc_stderr": 0.016841174655295724,
"acc_norm": 0.6679438058748404,
"acc_norm_stderr": 0.016841174655295724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808847,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808847
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.028384256704883034,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.028384256704883034
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.028320325830105915,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.028320325830105915
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.02779476010500873,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.02779476010500873
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347237,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347237
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3683181225554107,
"acc_stderr": 0.012319403369564642,
"acc_norm": 0.3683181225554107,
"acc_norm_stderr": 0.012319403369564642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.020203517280261447,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.020203517280261447
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935892,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935892
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.035281314729336065,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.035281314729336065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036155076303109365,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036155076303109365
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.42059760728665,
"mc2_stderr": 0.014423576804670638
}
}
```
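If you prefer to work with the raw JSON file linked above rather than the parquet splits, you can download it directly (a small sketch using `huggingface_hub`; note that `repo_type="dataset"` is required since this is a dataset repo):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b",
    filename="results_2023-10-03T13-22-21.722990.json",
    repo_type="dataset",
)
with open(path) as f:
    scores = json.load(f)

print(scores["all"]["acc"])  # aggregated accuracy shown in the snippet above
```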
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Eu001/Primo | 2023-10-03T15:06:49.000Z | [
"license:openrail",
"region:us"
] | Eu001 | null | null | null | 0 | 0 | ---
license: openrail
---
|
atom-in-the-universe/bild-1d25814d-8202-4a62-bdf2-f3998694ab95 | 2023-10-03T13:31:09.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
JoshRedmondUK/LatamSat | 2023-10-03T13:32:14.000Z | [
"license:cc-by-3.0",
"region:us"
] | JoshRedmondUK | null | null | null | 0 | 0 | ---
license: cc-by-3.0
---
|
AdrianaCasadei/CornGrain | 2023-10-03T14:03:33.000Z | [
"license:other",
"region:us"
] | AdrianaCasadei | null | null | null | 0 | 0 | ---
license: other
license_name: corngrain
license_link: LICENSE
---
|
open-llm-leaderboard/details_Xilabs__calypso-3b-alpha-v2 | 2023-10-03T14:03:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Xilabs/calypso-3b-alpha-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xilabs/calypso-3b-alpha-v2](https://huggingface.co/Xilabs/calypso-3b-alpha-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xilabs__calypso-3b-alpha-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T14:01:45.504923](https://huggingface.co/datasets/open-llm-leaderboard/details_Xilabs__calypso-3b-alpha-v2/blob/main/results_2023-10-03T14-01-45.504923.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2649272647272492,\n\
\ \"acc_stderr\": 0.031801100956562335,\n \"acc_norm\": 0.26857734769514896,\n\
\ \"acc_norm_stderr\": 0.03179637444357793,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111068,\n \"mc2\": 0.35726869770535075,\n\
\ \"mc2_stderr\": 0.013773730670830016\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38310580204778155,\n \"acc_stderr\": 0.014206472661672884,\n\
\ \"acc_norm\": 0.41552901023890787,\n \"acc_norm_stderr\": 0.014401366641216384\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5318661621190998,\n\
\ \"acc_stderr\": 0.004979637330230314,\n \"acc_norm\": 0.714797849034057,\n\
\ \"acc_norm_stderr\": 0.004505879084606856\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.1597222222222222,\n\
\ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.1597222222222222,\n\
\ \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n\
\ \"acc_stderr\": 0.02960562398177123,\n \"acc_norm\": 0.18497109826589594,\n\
\ \"acc_norm_stderr\": 0.02960562398177123\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843673,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843673\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102149,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102149\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.025091892378859275,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.025091892378859275\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782405,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782405\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790465,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790465\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817247,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817247\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148526,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148526\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.029443773022594693,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.029443773022594693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.31545338441890164,\n\
\ \"acc_stderr\": 0.016617501738763404,\n \"acc_norm\": 0.31545338441890164,\n\
\ \"acc_norm_stderr\": 0.016617501738763404\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927242,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927242\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244036,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244036\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378988,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378988\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683913,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111068,\n \"mc2\": 0.35726869770535075,\n\
\ \"mc2_stderr\": 0.013773730670830016\n }\n}\n```"
repo_url: https://huggingface.co/Xilabs/calypso-3b-alpha-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-01-45.504923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-01-45.504923.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-01-45.504923.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-01-45.504923.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_01_45.504923
path:
- results_2023-10-03T14-01-45.504923.parquet
- split: latest
path:
- results_2023-10-03T14-01-45.504923.parquet
---
# Dataset Card for Evaluation run of Xilabs/calypso-3b-alpha-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Xilabs/calypso-3b-alpha-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Xilabs/calypso-3b-alpha-v2](https://huggingface.co/Xilabs/calypso-3b-alpha-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xilabs__calypso-3b-alpha-v2",
"harness_truthfulqa_mc_0",
split="train")
```
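If you are unsure which of the 61 configurations you need, you can enumerate them first. This is a small sketch using the standard `datasets` helper `get_dataset_config_names`:
```python
from datasets import get_dataset_config_names

# List the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Xilabs__calypso-3b-alpha-v2"
)
print(len(configs))
print(configs[:5])
```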
## Latest results
These are the [latest results from run 2023-10-03T14:01:45.504923](https://huggingface.co/datasets/open-llm-leaderboard/details_Xilabs__calypso-3b-alpha-v2/blob/main/results_2023-10-03T14-01-45.504923.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2649272647272492,
"acc_stderr": 0.031801100956562335,
"acc_norm": 0.26857734769514896,
"acc_norm_stderr": 0.03179637444357793,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111068,
"mc2": 0.35726869770535075,
"mc2_stderr": 0.013773730670830016
},
"harness|arc:challenge|25": {
"acc": 0.38310580204778155,
"acc_stderr": 0.014206472661672884,
"acc_norm": 0.41552901023890787,
"acc_norm_stderr": 0.014401366641216384
},
"harness|hellaswag|10": {
"acc": 0.5318661621190998,
"acc_stderr": 0.004979637330230314,
"acc_norm": 0.714797849034057,
"acc_norm_stderr": 0.004505879084606856
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.1597222222222222,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.1597222222222222,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177123,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177123
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843673,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843673
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102149,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102149
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782405,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782405
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790465,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790465
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817247,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817247
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148526,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148526
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.31545338441890164,
"acc_stderr": 0.016617501738763404,
"acc_norm": 0.31545338441890164,
"acc_norm_stderr": 0.016617501738763404
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927242,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02388688192244036,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02388688192244036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378988,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378988
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683913,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111068,
"mc2": 0.35726869770535075,
"mc2_stderr": 0.013773730670830016
}
}
```
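To dig into the per-sample predictions behind any of the sub-task scores above, load that task's configuration directly. The sketch below uses the `harness_hendrycksTest_moral_scenarios_5` and `results` configurations declared in the YAML header, with the `latest` split aliasing this run:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Xilabs__calypso-3b-alpha-v2"

# Per-sample details for the 5-shot moral_scenarios sub-task;
# "latest" aliases the most recent run (here 2023-10-03T14:01:45).
details = load_dataset(REPO, "harness_hendrycksTest_moral_scenarios_5", split="latest")

# The aggregated metrics shown in the JSON above live in the "results" config.
results = load_dataset(REPO, "results", split="latest")
```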
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
weaviate/WithRetrieval-Random-Train-80 | 2023-10-03T14:03:15.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Mxode/Baike-Astronomy-ZH | 2023-10-03T14:19:38.000Z | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:zh",
"license:apache-2.0",
"astronomy",
"region:us"
] | Mxode | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
tags:
- astronomy
size_categories:
- n<1K
---
An astronomy encyclopedia with 8 subcategories, containing about 1,000 entries and roughly 1,100,000 characters.
Each record contains a top-level category, a sub-category, a title, and the content. Note that the **content has already been flattened to a single line**, and the **texts are generally quite long**.
A sample entry looks like this:
```json
{
"top_category": "天文学",
"sub_category": "天体力学",
"title": "万有引力定律",
"content": "万有引力定律(汉语拼音:wàn yǒu yǐn lì zhī dìng lǜ),(universal gravitation,law of),自然界中任何两个质点都相互吸引,这个力同两个质点的质量的乘积成正比,同它们之间的距离的二次方成反比。如用m1、m2表示两质点的质量,r表示两质点间的距离,F表示作用力的值,则F=Gm1m2/r2,式中的G是比例常量,称万有引力常量或牛顿引力常量,数值因不同单位制而异,在国际单位制中G为6.672×1011牛顿·米2/千克2。这个定律由牛顿于1687年在《原理》上首次发表,它和牛顿运动定律一起,构成了牛顿力学特别是天体力学的基础。\n 在牛顿公布该定律之前,胡克、惠更斯都曾根据开普勒定律推测行星和太阳间存在和距离二次方成反比的引力,但未能提出数学证明,为此胡克还和牛顿通过信,因此对定律的首创权有过争议。牛顿还曾对晚年的忘年交斯多克雷说过,1666年他在家乡避瘟疫时,曾因见苹果从树上落地而想到地球对苹果的引力是否可延伸到月球。此说传布很广,许多科学家深信不疑,并对牛顿为何推迟20年才发表有种种推测。但也有人根据牛顿晚年的精神状态,认为他对斯多克雷所说的并非真情。\n 一般物体之间的引力,在物体尺度远小于质心距离时,可视为质点;尺度和间距相近时,须视为质点系,用积分法求引力。但牛顿已算出一个密度均匀的圆球对附近质点的引力同把圆球的质量集中于球心时完全一致。对万有引力的起因,牛顿未作解释,把它视为超距力或以太的作用,系后人所为。爱因斯坦在广义相对论中将引力归之于时空曲率的变化。"
}
```
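A minimal loading sketch (assuming the repo's JSON records are picked up by the default loader as a single `train` split; the card itself does not specify the layout):
```python
from datasets import load_dataset

# Assumption: the records load under the default config's "train" split;
# pass data_files explicitly if the repository layout differs.
ds = load_dataset("Mxode/Baike-Astronomy-ZH", split="train")

sample = ds[0]
print(sample["top_category"], sample["sub_category"], sample["title"])
print(sample["content"][:100])  # content is one long single-line string
```
 |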
weaviate/WithRetrieval-SchemaSplit-Train-80 | 2023-10-03T14:09:52.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithRetrieval-SchemaSplit-Test-80 | 2023-10-03T14:10:25.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithRetrieval-SchemaSplit-Train-40 | 2023-10-03T14:11:04.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithRetrieval-SchemaSplit-Test-40 | 2023-10-03T14:12:06.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithRetrieval-SchemaSplit-Train-20 | 2023-10-03T14:12:36.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithRetrieval-SchemaSplit-Test-20 | 2023-10-03T14:13:21.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_winglian__basilisk-4b | 2023-10-03T14:18:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of winglian/basilisk-4b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [winglian/basilisk-4b](https://huggingface.co/winglian/basilisk-4b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_winglian__basilisk-4b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T14:16:48.676759](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__basilisk-4b/blob/main/results_2023-10-03T14-16-48.676759.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24737563255578193,\n\
\ \"acc_stderr\": 0.031058513226002612,\n \"acc_norm\": 0.24886993783532474,\n\
\ \"acc_norm_stderr\": 0.031068344367473282,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707682,\n \"mc2\": 0.43739201042404435,\n\
\ \"mc2_stderr\": 0.015087141124007967\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2354948805460751,\n \"acc_stderr\": 0.012399451855004752,\n\
\ \"acc_norm\": 0.25853242320819114,\n \"acc_norm_stderr\": 0.012794553754288679\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3309101772555268,\n\
\ \"acc_stderr\": 0.004695791340502856,\n \"acc_norm\": 0.39603664608643696,\n\
\ \"acc_norm_stderr\": 0.004880726787988633\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.031546980450822305,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.031546980450822305\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\"\
: 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342343,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342343\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.033954900208561116,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.033954900208561116\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n\
\ \"acc_stderr\": 0.02499305339776483,\n \"acc_norm\": 0.26129032258064516,\n\
\ \"acc_norm_stderr\": 0.02499305339776483\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.03257714077709661,\n\
\ \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.03257714077709661\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02213908110397153,\n \
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02213908110397153\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17880794701986755,\n \"acc_stderr\": 0.03128744850600725,\n \"\
acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.03128744850600725\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.20588235294117646,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.0372767357559692,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.0372767357559692\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.18404907975460122,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.18404907975460122,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347019,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347019\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27458492975734355,\n\
\ \"acc_stderr\": 0.01595982993308404,\n \"acc_norm\": 0.27458492975734355,\n\
\ \"acc_norm_stderr\": 0.01595982993308404\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.02279711027807114,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.02279711027807114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23016759776536314,\n\
\ \"acc_stderr\": 0.014078339253425826,\n \"acc_norm\": 0.23016759776536314,\n\
\ \"acc_norm_stderr\": 0.014078339253425826\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.02342037547829613,\n\
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.02342037547829613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.02461977195669716,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.02461977195669716\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.27469135802469136,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.27469135802469136,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478947,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478947\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683228,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683228\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.1871345029239766,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.1871345029239766,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707682,\n \"mc2\": 0.43739201042404435,\n\
\ \"mc2_stderr\": 0.015087141124007967\n }\n}\n```"
repo_url: https://huggingface.co/winglian/basilisk-4b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-16-48.676759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-16-48.676759.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-16-48.676759.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-16-48.676759.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_16_48.676759
path:
- results_2023-10-03T14-16-48.676759.parquet
- split: latest
path:
- results_2023-10-03T14-16-48.676759.parquet
---
# Dataset Card for Evaluation run of winglian/basilisk-4b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/winglian/basilisk-4b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [winglian/basilisk-4b](https://huggingface.co/winglian/basilisk-4b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_winglian__basilisk-4b",
"harness_truthfulqa_mc_0",
split="train")
```
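Similarly, the aggregated metrics can be pulled through the `results` configuration declared above, using its `latest` split; a minimal sketch with the same `datasets` API:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# its "latest" split points to the most recent results parquet.
results = load_dataset("open-llm-leaderboard/details_winglian__basilisk-4b",
    "results",
    split="latest")
```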
## Latest results
These are the [latest results from run 2023-10-03T14:16:48.676759](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__basilisk-4b/blob/main/results_2023-10-03T14-16-48.676759.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24737563255578193,
"acc_stderr": 0.031058513226002612,
"acc_norm": 0.24886993783532474,
"acc_norm_stderr": 0.031068344367473282,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707682,
"mc2": 0.43739201042404435,
"mc2_stderr": 0.015087141124007967
},
"harness|arc:challenge|25": {
"acc": 0.2354948805460751,
"acc_stderr": 0.012399451855004752,
"acc_norm": 0.25853242320819114,
"acc_norm_stderr": 0.012794553754288679
},
"harness|hellaswag|10": {
"acc": 0.3309101772555268,
"acc_stderr": 0.004695791340502856,
"acc_norm": 0.39603664608643696,
"acc_norm_stderr": 0.004880726787988633
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.031546980450822305,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.031546980450822305
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342343,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342343
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.033954900208561116,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.033954900208561116
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.02499305339776483,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.02499305339776483
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.03257714077709661,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.03257714077709661
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02213908110397153,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02213908110397153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.03128744850600725,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.03128744850600725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.0372767357559692,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.0372767357559692
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.18404907975460122,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.18404907975460122,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347019,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347019
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27458492975734355,
"acc_stderr": 0.01595982993308404,
"acc_norm": 0.27458492975734355,
"acc_norm_stderr": 0.01595982993308404
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.02279711027807114,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.02279711027807114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23016759776536314,
"acc_stderr": 0.014078339253425826,
"acc_norm": 0.23016759776536314,
"acc_norm_stderr": 0.014078339253425826
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.02461977195669716,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.02461977195669716
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.27469135802469136,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.27469135802469136,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478947,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478947
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.2,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683228,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683228
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.1871345029239766,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.1871345029239766,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707682,
"mc2": 0.43739201042404435,
"mc2_stderr": 0.015087141124007967
}
}
```
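Since the repository exposes one configuration per evaluated task (61 according to the summary above), the available configurations can be enumerated before loading; a minimal sketch, assuming the standard `get_dataset_config_names` helper from `datasets`:
```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations (plus "results") exposed by this repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_winglian__basilisk-4b")
print(len(configs), configs[:5])
```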
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_42MARU__sitebunny-13b | 2023-10-03T14:19:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of 42MARU/sitebunny-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [42MARU/sitebunny-13b](https://huggingface.co/42MARU/sitebunny-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_42MARU__sitebunny-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T14:18:14.630504](https://huggingface.co/datasets/open-llm-leaderboard/details_42MARU__sitebunny-13b/blob/main/results_2023-10-03T14-18-14.630504.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.599892576612954,\n\
\ \"acc_stderr\": 0.033862051169313345,\n \"acc_norm\": 0.6036466135566739,\n\
\ \"acc_norm_stderr\": 0.033840263660533464,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5620657173642927,\n\
\ \"mc2_stderr\": 0.015466745090276375\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536588,\n\
\ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042192\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6422027484564827,\n\
\ \"acc_stderr\": 0.0047837237982865,\n \"acc_norm\": 0.836387173869747,\n\
\ \"acc_norm_stderr\": 0.0036916784957679717\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197946,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197946\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724342,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724342\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478465,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.01743793717334323,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.01743793717334323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553325,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069432,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069432\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748929,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748929\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165545,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5251396648044693,\n\
\ \"acc_stderr\": 0.016701350842682625,\n \"acc_norm\": 0.5251396648044693,\n\
\ \"acc_norm_stderr\": 0.016701350842682625\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603753,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603753\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\
\ \"acc_stderr\": 0.01270403051885148,\n \"acc_norm\": 0.4491525423728814,\n\
\ \"acc_norm_stderr\": 0.01270403051885148\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5620657173642927,\n\
\ \"mc2_stderr\": 0.015466745090276375\n }\n}\n```"
repo_url: https://huggingface.co/42MARU/sitebunny-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-18-14.630504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-18-14.630504.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-18-14.630504.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-18-14.630504.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_18_14.630504
path:
- results_2023-10-03T14-18-14.630504.parquet
- split: latest
path:
- results_2023-10-03T14-18-14.630504.parquet
---
# Dataset Card for Evaluation run of 42MARU/sitebunny-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/42MARU/sitebunny-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [42MARU/sitebunny-13b](https://huggingface.co/42MARU/sitebunny-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_42MARU__sitebunny-13b",
"harness_truthfulqa_mc_0",
split="train")
```
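If you want the most recent run without hard-coding its timestamp, every configuration also exposes a `latest` split (see the YAML configs above); a minimal sketch:
```python
from datasets import load_dataset

# The "latest" split of each configuration always points at the newest
# timestamped run, so this keeps working after the model is re-evaluated.
data = load_dataset("open-llm-leaderboard/details_42MARU__sitebunny-13b",
	"harness_truthfulqa_mc_0",
	split="latest")
```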
## Latest results
These are the [latest results from run 2023-10-03T14:18:14.630504](https://huggingface.co/datasets/open-llm-leaderboard/details_42MARU__sitebunny-13b/blob/main/results_2023-10-03T14-18-14.630504.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.599892576612954,
"acc_stderr": 0.033862051169313345,
"acc_norm": 0.6036466135566739,
"acc_norm_stderr": 0.033840263660533464,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5620657173642927,
"mc2_stderr": 0.015466745090276375
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536588,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.014097810678042192
},
"harness|hellaswag|10": {
"acc": 0.6422027484564827,
"acc_stderr": 0.0047837237982865,
"acc_norm": 0.836387173869747,
"acc_norm_stderr": 0.0036916784957679717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197946,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197946
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724342,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724342
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.01743793717334323,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.01743793717334323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553325,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069432,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748929,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748929
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165545,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5251396648044693,
"acc_stderr": 0.016701350842682625,
"acc_norm": 0.5251396648044693,
"acc_norm_stderr": 0.016701350842682625
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603753,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.01270403051885148,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.01270403051885148
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829156,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829156
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5620657173642927,
"mc2_stderr": 0.015466745090276375
}
}
```
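To work with these aggregated numbers programmatically, one option is to download the results file linked above and index into it. Below is a minimal sketch using `huggingface_hub`; the exact nesting of the JSON is an assumption based on the snippet above, and depending on the file version the metrics may sit under a top-level "results" key:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_42MARU__sitebunny-13b",
    filename="results_2023-10-03T14-18-14.630504.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Hedge against both layouts: metrics either at the top level (as printed
# above) or nested under a "results" key.
metrics = data.get("results", data)
print(metrics["all"]["acc"])                      # average accuracy
print(metrics["harness|truthfulqa:mc|0"]["mc2"])  # TruthfulQA mc2
```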
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
stevecrawshaw/cesap_epc | 2023-10-03T14:27:04.000Z | [
"language:en",
"license:gpl-3.0",
"energy",
"environment",
"region:us"
] | stevecrawshaw | null | null | null | 0 | 0 | ---
license: gpl-3.0
language:
- en
tags:
- energy
- environment
pretty_name: EPC Data
--- |
open-llm-leaderboard/details_winglian__llama-2-4b | 2023-10-03T14:23:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of winglian/llama-2-4b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [winglian/llama-2-4b](https://huggingface.co/winglian/llama-2-4b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_winglian__llama-2-4b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T14:22:33.156570](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__llama-2-4b/blob/main/results_2023-10-03T14-22-33.156570.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24568705178933328,\n\
\ \"acc_stderr\": 0.03119929408116822,\n \"acc_norm\": 0.24834373372059426,\n\
\ \"acc_norm_stderr\": 0.03120753730929993,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.38720203973910133,\n\
\ \"mc2_stderr\": 0.014108484248648084\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28071672354948807,\n \"acc_stderr\": 0.01313123812697558,\n\
\ \"acc_norm\": 0.3122866894197952,\n \"acc_norm_stderr\": 0.013542598541688065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4076877116112328,\n\
\ \"acc_stderr\": 0.004904002676184321,\n \"acc_norm\": 0.5328619796853217,\n\
\ \"acc_norm_stderr\": 0.004978992721242828\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724077,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724077\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281337,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281337\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2129032258064516,\n \"acc_stderr\": 0.02328766512726853,\n \"\
acc_norm\": 0.2129032258064516,\n \"acc_norm_stderr\": 0.02328766512726853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617722,\n \"\
acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617722\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.02805779167298901,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.02805779167298901\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752964,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752964\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222735,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222735\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275886,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275886\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21467889908256882,\n \"acc_stderr\": 0.017604304149256487,\n \"\
acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.017604304149256487\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.029886910547626957,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.029886910547626957\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.02845882099146031,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.02845882099146031\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755807,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755807\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.02934311479809447,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.02934311479809447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24776500638569604,\n\
\ \"acc_stderr\": 0.015438083080568966,\n \"acc_norm\": 0.24776500638569604,\n\
\ \"acc_norm_stderr\": 0.015438083080568966\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113596,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113596\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.022827317491059672,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.022827317491059672\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090202,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090202\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.01100597139992723,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.01100597139992723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.022368672562886754,\n\
\ \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.022368672562886754\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n\
\ \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1469387755102041,\n\
\ \"acc_stderr\": 0.02266540041721763,\n \"acc_norm\": 0.1469387755102041,\n\
\ \"acc_norm_stderr\": 0.02266540041721763\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.035716092300534796,\n\
\ \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.035716092300534796\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.34502923976608185,\n\
\ \"acc_stderr\": 0.036459813773888065,\n \"acc_norm\": 0.34502923976608185,\n\
\ \"acc_norm_stderr\": 0.036459813773888065\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22399020807833536,\n \"mc1_stderr\": 0.014594964329474203,\n\
\ \"mc2\": 0.38720203973910133,\n \"mc2_stderr\": 0.014108484248648084\n\
\ }\n}\n```"
repo_url: https://huggingface.co/winglian/llama-2-4b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-22-33.156570.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- results_2023-10-03T14-22-33.156570.parquet
- split: latest
path:
- results_2023-10-03T14-22-33.156570.parquet
---
# Dataset Card for Evaluation run of winglian/llama-2-4b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/winglian/llama-2-4b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [winglian/llama-2-4b](https://huggingface.co/winglian/llama-2-4b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_winglian__llama-2-4b",
"harness_truthfulqa_mc_0",
split="train")
```
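Each config above also defines a `latest` split, so you can avoid pinning a timestamped split. A minimal variant of the same call (a sketch, assuming the `datasets` API used above):
```python
from datasets import load_dataset

# "latest" always resolves to the most recent evaluation run for this config,
# so the call keeps working when new runs are appended as timestamped splits.
data = load_dataset("open-llm-leaderboard/details_winglian__llama-2-4b",
	"harness_truthfulqa_mc_0",
	split="latest")
```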
## Latest results
These are the [latest results from run 2023-10-03T14:22:33.156570](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__llama-2-4b/blob/main/results_2023-10-03T14-22-33.156570.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24568705178933328,
"acc_stderr": 0.03119929408116822,
"acc_norm": 0.24834373372059426,
"acc_norm_stderr": 0.03120753730929993,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.38720203973910133,
"mc2_stderr": 0.014108484248648084
},
"harness|arc:challenge|25": {
"acc": 0.28071672354948807,
"acc_stderr": 0.01313123812697558,
"acc_norm": 0.3122866894197952,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.4076877116112328,
"acc_stderr": 0.004904002676184321,
"acc_norm": 0.5328619796853217,
"acc_norm_stderr": 0.004978992721242828
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724077,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724077
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281337,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281337
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2129032258064516,
"acc_stderr": 0.02328766512726853,
"acc_norm": 0.2129032258064516,
"acc_norm_stderr": 0.02328766512726853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617722,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617722
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752964,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752964
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222735,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222735
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473834,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473834
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.017604304149256487,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.017604304149256487
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.029886910547626957,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.029886910547626957
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.02845882099146031,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.02845882099146031
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755807,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755807
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809447,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24776500638569604,
"acc_stderr": 0.015438083080568966,
"acc_norm": 0.24776500638569604,
"acc_norm_stderr": 0.015438083080568966
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113596,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113596
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.022827317491059672,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.022827317491059672
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090202,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090202
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.01100597139992723,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.01100597139992723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.022368672562886754,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.022368672562886754
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1469387755102041,
"acc_stderr": 0.02266540041721763,
"acc_norm": 0.1469387755102041,
"acc_norm_stderr": 0.02266540041721763
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.035716092300534796,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.035716092300534796
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.38720203973910133,
"mc2_stderr": 0.014108484248648084
}
}
```
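To work with these aggregated numbers programmatically rather than copying them from the JSON above, a minimal sketch (assuming the `results` config and `latest` split declared in this card's YAML):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above; its
# "latest" split points at results_2023-10-03T14-22-33.156570.parquet.
results = load_dataset("open-llm-leaderboard/details_winglian__llama-2-4b",
	"results",
	split="latest")

# Convert to a pandas DataFrame for inspection; the exact column layout
# depends on how the leaderboard serialized this run.
df = results.to_pandas()
print(df.columns)
```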
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Madjogger/JamSpell_dataset | 2023-10-07T17:44:56.000Z | [
"license:apache-2.0",
"region:us"
] | Madjogger | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_winglian__Llama-2-3b-hf | 2023-10-03T14:30:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of winglian/Llama-2-3b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [winglian/Llama-2-3b-hf](https://huggingface.co/winglian/Llama-2-3b-hf) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_winglian__Llama-2-3b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T14:29:31.026296](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__Llama-2-3b-hf/blob/main/results_2023-10-03T14-29-31.026296.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23360130874252474,\n\
\ \"acc_stderr\": 0.030790734157887217,\n \"acc_norm\": 0.23447771979274365,\n\
\ \"acc_norm_stderr\": 0.030804894045365534,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766365,\n \"mc2\": 0.5070910716215149,\n\
\ \"mc2_stderr\": 0.016388954007308647\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2226962457337884,\n \"acc_stderr\": 0.012158314774829926,\n\
\ \"acc_norm\": 0.2696245733788396,\n \"acc_norm_stderr\": 0.012968040686869148\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2604062935670185,\n\
\ \"acc_stderr\": 0.004379594059141042,\n \"acc_norm\": 0.2651862178848835,\n\
\ \"acc_norm_stderr\": 0.004405301508322379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788137,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788137\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523809,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523809\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n\
\ \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n\
\ \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n\
\ \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593614,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593614\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1871559633027523,\n \"acc_stderr\": 0.01672268452620016,\n \"\
acc_norm\": 0.1871559633027523,\n \"acc_norm_stderr\": 0.01672268452620016\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.14814814814814814,\n \"acc_stderr\": 0.024227629273728356,\n \"\
acc_norm\": 0.14814814814814814,\n \"acc_norm_stderr\": 0.024227629273728356\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693268,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693268\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.029745048572674043,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.029745048572674043\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.022779719088733396,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.022779719088733396\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537755,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537755\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\
\ \"acc_stderr\": 0.010956556654417362,\n \"acc_norm\": 0.24315514993481094,\n\
\ \"acc_norm_stderr\": 0.010956556654417362\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n\
\ \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n\
\ \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.27710843373493976,\n \"acc_stderr\": 0.034843315926805875,\n\
\ \"acc_norm\": 0.27710843373493976,\n \"acc_norm_stderr\": 0.034843315926805875\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.036155076303109344,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.036155076303109344\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766365,\n\
\ \"mc2\": 0.5070910716215149,\n \"mc2_stderr\": 0.016388954007308647\n\
\ }\n}\n```"
repo_url: https://huggingface.co/winglian/Llama-2-3b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-31.026296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-31.026296.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-29-31.026296.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-29-31.026296.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_29_31.026296
path:
- results_2023-10-03T14-29-31.026296.parquet
- split: latest
path:
- results_2023-10-03T14-29-31.026296.parquet
---
# Dataset Card for Evaluation run of winglian/Llama-2-3b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/winglian/Llama-2-3b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [winglian/Llama-2-3b-hf](https://huggingface.co/winglian/Llama-2-3b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_winglian__Llama-2-3b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
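Similarly, as a brief sketch using only the `results` config and the `latest` split declared in this card's configuration list, the aggregated metrics of the most recent run can be loaded directly:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_winglian__Llama-2-3b-hf",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated scores for this run
```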
## Latest results
These are the [latest results from run 2023-10-03T14:29:31.026296](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__Llama-2-3b-hf/blob/main/results_2023-10-03T14-29-31.026296.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23360130874252474,
"acc_stderr": 0.030790734157887217,
"acc_norm": 0.23447771979274365,
"acc_norm_stderr": 0.030804894045365534,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766365,
"mc2": 0.5070910716215149,
"mc2_stderr": 0.016388954007308647
},
"harness|arc:challenge|25": {
"acc": 0.2226962457337884,
"acc_stderr": 0.012158314774829926,
"acc_norm": 0.2696245733788396,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.2604062935670185,
"acc_stderr": 0.004379594059141042,
"acc_norm": 0.2651862178848835,
"acc_norm_stderr": 0.004405301508322379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788137,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788137
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523809,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523809
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593614,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593614
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1871559633027523,
"acc_stderr": 0.01672268452620016,
"acc_norm": 0.1871559633027523,
"acc_norm_stderr": 0.01672268452620016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.14814814814814814,
"acc_stderr": 0.024227629273728356,
"acc_norm": 0.14814814814814814,
"acc_norm_stderr": 0.024227629273728356
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693268,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693268
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.029745048572674043,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.029745048572674043
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.022779719088733396,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.022779719088733396
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537755,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537755
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.010956556654417362,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.010956556654417362
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.036155076303109344,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.036155076303109344
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766365,
"mc2": 0.5070910716215149,
"mc2_stderr": 0.016388954007308647
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
napatswift/budget-seq2seq | 2023-10-05T06:29:46.000Z | [
"region:us"
] | napatswift | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: line_item
sequence: string
- name: target
dtype: string
- name: format
dtype: string
splits:
- name: train
num_bytes: 134450572.0
num_examples: 21510
download_size: 23772061
dataset_size: 134450572.0
---
# Dataset Card for "budget-seq2seq-xml"
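A minimal loading sketch (assuming the default configuration; the split and field names below mirror the `dataset_info` declared in the YAML header above):
```python
from datasets import load_dataset

# Load the single "train" split declared in dataset_info (21,510 examples).
ds = load_dataset("napatswift/budget-seq2seq", split="train")

example = ds[0]
print(example["line_item"])  # sequence of strings
print(example["target"])     # target string
print(example["format"])     # format string
```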
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_dfurman__llama-2-7b-instruct-peft | 2023-10-03T14:31:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dfurman/llama-2-7b-instruct-peft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dfurman/llama-2-7b-instruct-peft](https://huggingface.co/dfurman/llama-2-7b-instruct-peft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__llama-2-7b-instruct-peft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T14:29:36.510142](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__llama-2-7b-instruct-peft/blob/main/results_2023-10-03T14-29-36.510142.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4685145105933214,\n\
\ \"acc_stderr\": 0.03535588922202098,\n \"acc_norm\": 0.47253998227829414,\n\
\ \"acc_norm_stderr\": 0.03534192029601582,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.016387976779647942,\n \"mc2\": 0.48497124472049463,\n\
\ \"mc2_stderr\": 0.014868370687930874\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47440273037542663,\n \"acc_stderr\": 0.014592230885298964,\n\
\ \"acc_norm\": 0.5119453924914675,\n \"acc_norm_stderr\": 0.014607220340597167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5892252539334794,\n\
\ \"acc_stderr\": 0.004909689876342043,\n \"acc_norm\": 0.7891854212308305,\n\
\ \"acc_norm_stderr\": 0.004070533786739672\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014499,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972602,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.03804913653971011,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.03804913653971011\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5353535353535354,\n \"acc_stderr\": 0.035534363688280626,\n \"\
acc_norm\": 0.5353535353535354,\n \"acc_norm_stderr\": 0.035534363688280626\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4256410256410256,\n \"acc_stderr\": 0.02506909438729654,\n \
\ \"acc_norm\": 0.4256410256410256,\n \"acc_norm_stderr\": 0.02506909438729654\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6165137614678899,\n \"acc_stderr\": 0.02084715664191598,\n \"\
acc_norm\": 0.6165137614678899,\n \"acc_norm_stderr\": 0.02084715664191598\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100998,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100998\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6176470588235294,\n \"acc_stderr\": 0.0341078533890472,\n \"acc_norm\"\
: 0.6176470588235294,\n \"acc_norm_stderr\": 0.0341078533890472\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185805,\n \"\
acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185805\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.43558282208588955,\n \"acc_stderr\": 0.038956324641389366,\n\
\ \"acc_norm\": 0.43558282208588955,\n \"acc_norm_stderr\": 0.038956324641389366\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6526181353767561,\n\
\ \"acc_stderr\": 0.01702667174865573,\n \"acc_norm\": 0.6526181353767561,\n\
\ \"acc_norm_stderr\": 0.01702667174865573\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095256,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095256\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759563,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759563\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583302,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583302\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3428943937418514,\n\
\ \"acc_stderr\": 0.012123463271585892,\n \"acc_norm\": 0.3428943937418514,\n\
\ \"acc_norm_stderr\": 0.012123463271585892\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4624183006535948,\n \"acc_stderr\": 0.020170614974969775,\n \
\ \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.020170614974969775\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893783,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n\
\ \"acc_stderr\": 0.03461199429040013,\n \"acc_norm\": 0.6019900497512438,\n\
\ \"acc_norm_stderr\": 0.03461199429040013\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.03765845117168862,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.03765845117168862\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.03546976959393161,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.03546976959393161\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.016387976779647942,\n \"mc2\": 0.48497124472049463,\n\
\ \"mc2_stderr\": 0.014868370687930874\n }\n}\n```"
repo_url: https://huggingface.co/dfurman/llama-2-7b-instruct-peft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-36.510142.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-29-36.510142.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-29-36.510142.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-29-36.510142.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_29_36.510142
path:
- results_2023-10-03T14-29-36.510142.parquet
- split: latest
path:
- results_2023-10-03T14-29-36.510142.parquet
---
# Dataset Card for Evaluation run of dfurman/llama-2-7b-instruct-peft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dfurman/llama-2-7b-instruct-peft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dfurman/llama-2-7b-instruct-peft](https://huggingface.co/dfurman/llama-2-7b-instruct-peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__llama-2-7b-instruct-peft",
"harness_truthfulqa_mc_0",
split="train")
```
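Since every split is named after a run timestamp, the `latest` split alias is the convenient way to read the aggregated metrics as well. A minimal sketch, using the same `datasets` API as above and pointing at the "results" configuration described earlier (the exact schema of the returned rows is not documented in this card):
```python
from datasets import load_dataset

# The "results" configuration aggregates all metrics of the run;
# the "latest" split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_dfurman__llama-2-7b-instruct-peft",
    "results",
    split="latest",
)
```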
## Latest results
These are the [latest results from run 2023-10-03T14:29:36.510142](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__llama-2-7b-instruct-peft/blob/main/results_2023-10-03T14-29-36.510142.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4685145105933214,
"acc_stderr": 0.03535588922202098,
"acc_norm": 0.47253998227829414,
"acc_norm_stderr": 0.03534192029601582,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.016387976779647942,
"mc2": 0.48497124472049463,
"mc2_stderr": 0.014868370687930874
},
"harness|arc:challenge|25": {
"acc": 0.47440273037542663,
"acc_stderr": 0.014592230885298964,
"acc_norm": 0.5119453924914675,
"acc_norm_stderr": 0.014607220340597167
},
"harness|hellaswag|10": {
"acc": 0.5892252539334794,
"acc_stderr": 0.004909689876342043,
"acc_norm": 0.7891854212308305,
"acc_norm_stderr": 0.004070533786739672
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014499,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972602,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.03804913653971011,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.03804913653971011
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5353535353535354,
"acc_stderr": 0.035534363688280626,
"acc_norm": 0.5353535353535354,
"acc_norm_stderr": 0.035534363688280626
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4256410256410256,
"acc_stderr": 0.02506909438729654,
"acc_norm": 0.4256410256410256,
"acc_norm_stderr": 0.02506909438729654
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6165137614678899,
"acc_stderr": 0.02084715664191598,
"acc_norm": 0.6165137614678899,
"acc_norm_stderr": 0.02084715664191598
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100998,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100998
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.0341078533890472,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.0341078533890472
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6455696202531646,
"acc_stderr": 0.031137304297185805,
"acc_norm": 0.6455696202531646,
"acc_norm_stderr": 0.031137304297185805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.43558282208588955,
"acc_stderr": 0.038956324641389366,
"acc_norm": 0.43558282208588955,
"acc_norm_stderr": 0.038956324641389366
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6526181353767561,
"acc_stderr": 0.01702667174865573,
"acc_norm": 0.6526181353767561,
"acc_norm_stderr": 0.01702667174865573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095256,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095256
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759563,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583302,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583302
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.02878222756134724,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.02878222756134724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3428943937418514,
"acc_stderr": 0.012123463271585892,
"acc_norm": 0.3428943937418514,
"acc_norm_stderr": 0.012123463271585892
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4624183006535948,
"acc_stderr": 0.020170614974969775,
"acc_norm": 0.4624183006535948,
"acc_norm_stderr": 0.020170614974969775
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6019900497512438,
"acc_stderr": 0.03461199429040013,
"acc_norm": 0.6019900497512438,
"acc_norm_stderr": 0.03461199429040013
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.03765845117168862,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.03765845117168862
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.03546976959393161,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.03546976959393161
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.016387976779647942,
"mc2": 0.48497124472049463,
"mc2_stderr": 0.014868370687930874
}
}
```
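Because successive evals may add tasks, it can help to enumerate the available configurations before loading one. A minimal sketch using the standard `datasets` helpers; the task configuration chosen below is one of those listed in this card's metadata:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_dfurman__llama-2-7b-instruct-peft"

# List the 61 per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# Load the "latest" split of a single task's details.
details = load_dataset(repo, "harness_hendrycksTest_world_religions_5", split="latest")
```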
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
feynman-integrals-nn/box | 2023-10-05T23:34:30.000Z | [
"license:cc-by-4.0",
"region:us"
] | feynman-integrals-nn | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
# box
* [data](https://huggingface.co/datasets/feynman-integrals-nn/box)
* [model](https://huggingface.co/feynman-integrals-nn/box)
* [source](https://gitlab.com/feynman-integrals-nn/feynman-integrals-nn/-/tree/main/box)
|
TrainingDataPro/people-with-guns-segmentation-and-detection | 2023-10-03T14:51:42.000Z | [
"task_categories:image-segmentation",
"task_categories:object-detection",
"language:en",
"license:cc-by-nc-nd-4.0",
"code",
"finance",
"legal",
"region:us"
] | TrainingDataPro | null | null | null | 1 | 0 | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-segmentation
- object-detection
language:
- en
tags:
- code
- finance
- legal
---
# People with Guns Segmentation & Detection Dataset
The dataset consists of photos depicting **individuals holding guns**. It specifically focuses on the **segmentation** of guns within these images and the **detection** of people holding guns.
Each image in the dataset presents a different scenario, capturing individuals from various *backgrounds, genders, and age groups in different poses* while holding guns.
The dataset is an essential resource for the development and evaluation of computer vision models and algorithms in fields related to *firearms recognition, security systems, law enforcement, and safety analysis*.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=people-with-guns-segmentation-and-detection) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
- **images** - contains the original images of people holding guns
- **labels** - includes the visualized labeling created for the original images
- **annotations.xml** - contains the coordinates of the polygons and bounding boxes created for the original photos
# Data Format
Each image from the `images` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the bounding boxes and polygons. For each point, the x and y coordinates are provided.
### Classes:
- **person**: the person holding the gun, detected with a bounding box,
- **gun**: the gun, labeled with a polygon
# Example of XML file structure

# The People with Guns Segmentation & Detection dataset can be made in accordance with your requirements.
## **[TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=people-with-guns-segmentation-and-detection)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/trainingdata-pro** |
open-llm-leaderboard/details_PulsarAI__MythoMax-L2-LoRA-Assemble-13B | 2023-10-03T14:59:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/MythoMax-L2-LoRA-Assemble-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/MythoMax-L2-LoRA-Assemble-13B](https://huggingface.co/PulsarAI/MythoMax-L2-LoRA-Assemble-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__MythoMax-L2-LoRA-Assemble-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T14:58:01.778055](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MythoMax-L2-LoRA-Assemble-13B/blob/main/results_2023-10-03T14-58-01.778055.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.598938175511998,\n\
\ \"acc_stderr\": 0.03385413189247629,\n \"acc_norm\": 0.6028583107012461,\n\
\ \"acc_norm_stderr\": 0.03383158640553202,\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5594181501740189,\n\
\ \"mc2_stderr\": 0.015699414732693026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536587,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6358295160326628,\n\
\ \"acc_stderr\": 0.004802133511654241,\n \"acc_norm\": 0.8346942840071699,\n\
\ \"acc_norm_stderr\": 0.003706970856410953\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342658,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342658\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n\
\ \"acc_stderr\": 0.01732435232501602,\n \"acc_norm\": 0.7944954128440367,\n\
\ \"acc_norm_stderr\": 0.01732435232501602\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400172,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48044692737430167,\n\
\ \"acc_stderr\": 0.016709709877661995,\n \"acc_norm\": 0.48044692737430167,\n\
\ \"acc_norm_stderr\": 0.016709709877661995\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504526,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504526\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5898692810457516,\n \"acc_stderr\": 0.019898412717635903,\n \
\ \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.019898412717635903\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5594181501740189,\n\
\ \"mc2_stderr\": 0.015699414732693026\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/MythoMax-L2-LoRA-Assemble-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-58-01.778055.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-58-01.778055.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-58-01.778055.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-58-01.778055.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_58_01.778055
path:
- results_2023-10-03T14-58-01.778055.parquet
- split: latest
path:
- results_2023-10-03T14-58-01.778055.parquet
---
# Dataset Card for Evaluation run of PulsarAI/MythoMax-L2-LoRA-Assemble-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/MythoMax-L2-LoRA-Assemble-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/MythoMax-L2-LoRA-Assemble-13B](https://huggingface.co/PulsarAI/MythoMax-L2-LoRA-Assemble-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__MythoMax-L2-LoRA-Assemble-13B",
"harness_truthfulqa_mc_0",
split="train")
```
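The same call works for any of the per-task configurations listed in this card's metadata. The aggregated `results` configuration can be loaded the same way, and the `latest` split always resolves to the most recent run; a minimal sketch, using only configurations declared above:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__MythoMax-L2-LoRA-Assemble-13B",
    "results",
    split="latest",
)
```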
## Latest results
These are the [latest results from run 2023-10-03T14:58:01.778055](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MythoMax-L2-LoRA-Assemble-13B/blob/main/results_2023-10-03T14-58-01.778055.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.598938175511998,
"acc_stderr": 0.03385413189247629,
"acc_norm": 0.6028583107012461,
"acc_norm_stderr": 0.03383158640553202,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5594181501740189,
"mc2_stderr": 0.015699414732693026
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536587,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.6358295160326628,
"acc_stderr": 0.004802133511654241,
"acc_norm": 0.8346942840071699,
"acc_norm_stderr": 0.003706970856410953
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342658,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342658
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501602,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501602
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400172,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877661995,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877661995
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504526,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.019898412717635903,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.019898412717635903
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5594181501740189,
"mc2_stderr": 0.015699414732693026
}
}
```
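The same figures can be pulled programmatically. Below is a minimal sketch, assuming the `huggingface_hub` client is installed, that downloads the raw results JSON linked above and reads the aggregated metrics from its `"all"` key:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results file for this run directly from the dataset repo;
# repo_type="dataset" is needed because this is a dataset, not a model repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_PulsarAI__MythoMax-L2-LoRA-Assemble-13B",
    filename="results_2023-10-03T14-58-01.778055.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Aggregated accuracy metrics live under the "all" key, as shown above.
print(results["all"]["acc"], results["all"]["mc2"])
```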
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NousResearch__Capybara-7B | 2023-10-03T15:13:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NousResearch/Capybara-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/Capybara-7B](https://huggingface.co/NousResearch/Capybara-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Capybara-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T15:11:52.026776](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Capybara-7B/blob/main/results_2023-10-03T15-11-52.026776.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4906960835084916,\n\
\ \"acc_stderr\": 0.034810613237060924,\n \"acc_norm\": 0.49454145605365324,\n\
\ \"acc_norm_stderr\": 0.03479380901358992,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5107046569316315,\n\
\ \"mc2_stderr\": 0.01581761558569416\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.0146028783885366,\n\
\ \"acc_norm\": 0.5520477815699659,\n \"acc_norm_stderr\": 0.01453201149821168\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6157140011949811,\n\
\ \"acc_stderr\": 0.004854318994447738,\n \"acc_norm\": 0.8076080462059351,\n\
\ \"acc_norm_stderr\": 0.003933736699983618\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.030709486992556545,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.030709486992556545\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.041808067502949374,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.041808067502949374\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185555,\n\
\ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185555\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112143,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.0416345303130286,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.0416345303130286\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509567,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03358618145732522,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03358618145732522\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.0311958408777003,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.0311958408777003\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240637,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240637\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n\
\ \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6880733944954128,\n \"acc_stderr\": 0.019862967976707245,\n \"\
acc_norm\": 0.6880733944954128,\n \"acc_norm_stderr\": 0.019862967976707245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6715686274509803,\n \"acc_stderr\": 0.03296245110172229,\n \"\
acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.03296245110172229\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04712821257426769,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04712821257426769\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356389,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356389\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004257,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004257\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.698595146871009,\n\
\ \"acc_stderr\": 0.01640909109726878,\n \"acc_norm\": 0.698595146871009,\n\
\ \"acc_norm_stderr\": 0.01640909109726878\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.026864624366756643,\n\
\ \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.026864624366756643\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4738562091503268,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n\
\ \"acc_stderr\": 0.012389052105003727,\n \"acc_norm\": 0.378748370273794,\n\
\ \"acc_norm_stderr\": 0.012389052105003727\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535197,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535197\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.477124183006536,\n \"acc_stderr\": 0.020206653187884786,\n \
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.020206653187884786\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.43673469387755104,\n \"acc_stderr\": 0.03175195237583322,\n\
\ \"acc_norm\": 0.43673469387755104,\n \"acc_norm_stderr\": 0.03175195237583322\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348643,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611551,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611551\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5107046569316315,\n\
\ \"mc2_stderr\": 0.01581761558569416\n }\n}\n```"
repo_url: https://huggingface.co/NousResearch/Capybara-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|arc:challenge|25_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hellaswag|10_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-11-52.026776.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-11-52.026776.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T15-11-52.026776.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T15-11-52.026776.parquet'
- config_name: results
data_files:
- split: 2023_10_03T15_11_52.026776
path:
- results_2023-10-03T15-11-52.026776.parquet
- split: latest
path:
- results_2023-10-03T15-11-52.026776.parquet
---
# Dataset Card for Evaluation run of NousResearch/Capybara-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Capybara-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Capybara-7B](https://huggingface.co/NousResearch/Capybara-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Capybara-7B",
"harness_truthfulqa_mc_0",
split="train")
```
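Similarly, to pull the per-example details of a single MMLU sub-task from this run, you can target its task-specific configuration. This is a minimal sketch, assuming the split names follow the configs defined above (the run timestamp and "latest"):
```python
from datasets import load_dataset

# Each evaluated task has its own configuration; the available splits are
# the run timestamp (e.g. "2023_10_03T15_11_52.026776") and "latest".
details = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Capybara-7B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```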
## Latest results
These are the [latest results from run 2023-10-03T15:11:52.026776](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Capybara-7B/blob/main/results_2023-10-03T15-11-52.026776.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4906960835084916,
"acc_stderr": 0.034810613237060924,
"acc_norm": 0.49454145605365324,
"acc_norm_stderr": 0.03479380901358992,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5107046569316315,
"mc2_stderr": 0.01581761558569416
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.0146028783885366,
"acc_norm": 0.5520477815699659,
"acc_norm_stderr": 0.01453201149821168
},
"harness|hellaswag|10": {
"acc": 0.6157140011949811,
"acc_stderr": 0.004854318994447738,
"acc_norm": 0.8076080462059351,
"acc_norm_stderr": 0.003933736699983618
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.030709486992556545,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.030709486992556545
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.041808067502949374,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.041808067502949374
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185555,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185555
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112143,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.0416345303130286,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.0416345303130286
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.0311958408777003,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.0311958408777003
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240637,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240637
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6880733944954128,
"acc_stderr": 0.019862967976707245,
"acc_norm": 0.6880733944954128,
"acc_norm_stderr": 0.019862967976707245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.03296245110172229,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.03296245110172229
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04712821257426769,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04712821257426769
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.04825729337356389,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.04825729337356389
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004257,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004257
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.698595146871009,
"acc_stderr": 0.01640909109726878,
"acc_norm": 0.698595146871009,
"acc_norm_stderr": 0.01640909109726878
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.026864624366756643,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.026864624366756643
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.02878222756134724,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.02878222756134724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.378748370273794,
"acc_stderr": 0.012389052105003727,
"acc_norm": 0.378748370273794,
"acc_norm_stderr": 0.012389052105003727
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535197,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535197
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.020206653187884786,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.020206653187884786
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.43673469387755104,
"acc_stderr": 0.03175195237583322,
"acc_norm": 0.43673469387755104,
"acc_norm_stderr": 0.03175195237583322
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611551,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611551
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5107046569316315,
"mc2_stderr": 0.01581761558569416
}
}
```
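The aggregated numbers shown above are also stored in the "results" configuration of this dataset. As a minimal sketch, assuming the same split naming as the per-task configs, they can be loaded programmatically:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# its "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Capybara-7B",
    "results",
    split="latest",
)
print(results)
```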
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
abhayzala/TRUSTDataFiles | 2023-10-03T15:33:46.000Z | [
"region:us"
] | abhayzala | null | null | null | 0 | 0 | Required data files for using the TRUST model for skin tone analysis in the [DALL-Eval: Probing the Reasoning Skills and Social Biases of Text-to-Image Generation Models (ICCV 2023)](https://github.com/j-min/DallEval) paper.
Please note that these files are merged together and uploaded here for convenience, but they can also be obtained from the original [TRUST repo](https://github.com/HavenFeng/TRUST).
|
Yuri3/Aqua | 2023-10-03T15:23:04.000Z | [
"region:us"
] | Yuri3 | null | null | null | 0 | 0 | Entry not found |
Icaruas/goodwrite | 2023-10-03T20:48:16.000Z | [
"region:us"
] | Icaruas | null | null | null | 1 | 0 | Entry not found |
open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST2 | 2023-10-03T15:38:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/Llama-2-13b-FINETUNE4_TEST2](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T15:36:38.191985](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST2/blob/main/results_2023-10-03T15-36-38.191985.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5664208490341656,\n\
\ \"acc_stderr\": 0.03443990811908472,\n \"acc_norm\": 0.5706784746136042,\n\
\ \"acc_norm_stderr\": 0.034420058031149,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40185287585334617,\n\
\ \"mc2_stderr\": 0.014323074457881934\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.014564318856924848,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216377\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6101374228241386,\n\
\ \"acc_stderr\": 0.004867221634461272,\n \"acc_norm\": 0.8169687313284206,\n\
\ \"acc_norm_stderr\": 0.0038590186619619944\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.02450877752102841,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.02450877752102841\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300642,\n \"\
acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300642\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764205,\n\
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.781651376146789,\n \"acc_stderr\": 0.017712600528722717,\n \"\
acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722717\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869327,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869327\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009168,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009168\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333564,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333564\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n\
\ \"acc_stderr\": 0.015521923933523649,\n \"acc_norm\": 0.3139664804469274,\n\
\ \"acc_norm_stderr\": 0.015521923933523649\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719967,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719967\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.012700582404768217,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.012700582404768217\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355554,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40185287585334617,\n\
\ \"mc2_stderr\": 0.014323074457881934\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|arc:challenge|25_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hellaswag|10_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-36-38.191985.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-36-38.191985.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T15-36-38.191985.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T15-36-38.191985.parquet'
- config_name: results
data_files:
- split: 2023_10_03T15_36_38.191985
path:
- results_2023-10-03T15-36-38.191985.parquet
- split: latest
path:
- results_2023-10-03T15-36-38.191985.parquet
---
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4_TEST2](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST2",
"harness_truthfulqa_mc_0",
split="train")
```
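The aggregated "results" configuration defined in the YAML header can be loaded the same way. A minimal sketch (assuming the "latest" split listed in the config above; this example is not part of the generated card):

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST2",
    "results",
    split="latest",
)
```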
## Latest results
These are the [latest results from run 2023-10-03T15:36:38.191985](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST2/blob/main/results_2023-10-03T15-36-38.191985.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5664208490341656,
"acc_stderr": 0.03443990811908472,
"acc_norm": 0.5706784746136042,
"acc_norm_stderr": 0.034420058031149,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40185287585334617,
"mc2_stderr": 0.014323074457881934
},
"harness|arc:challenge|25": {
"acc": 0.5401023890784983,
"acc_stderr": 0.014564318856924848,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216377
},
"harness|hellaswag|10": {
"acc": 0.6101374228241386,
"acc_stderr": 0.004867221634461272,
"acc_norm": 0.8169687313284206,
"acc_norm_stderr": 0.0038590186619619944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.02450877752102841,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.02450877752102841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.025189149894764205,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.025189149894764205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722717,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722717
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869327,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869327
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009168,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009168
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333564,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333564
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.026074314851657083,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.026074314851657083
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3139664804469274,
"acc_stderr": 0.015521923933523649,
"acc_norm": 0.3139664804469274,
"acc_norm_stderr": 0.015521923933523649
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648043,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719967,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768217,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768217
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355554,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40185287585334617,
"mc2_stderr": 0.014323074457881934
}
}
```
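The raw results file linked above can also be fetched directly. A minimal sketch (the filename is taken from the link above; the `hf_hub_download` approach is an assumption, not part of the generated card):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON from the dataset repo and read one metric.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST2",
    filename="results_2023-10-03T15-36-38.191985.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(results["all"]["acc"])  # aggregated accuracy, per the JSON above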
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-15dc19d9-0cd0-4182-9c31-baf191d8b5eb | 2023-10-03T16:09:06.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-e3031b87-ff91-4961-acce-73e2269a264a | 2023-10-03T16:13:08.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
WayneWei017/research_test | 2023-10-03T16:49:27.000Z | [
"region:us"
] | WayneWei017 | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-9922c341-3758-41ab-94a8-fc2574ed1b9e | 2023-10-03T16:28:22.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
BangumiBase/katanagatari | 2023-10-03T17:55:57.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Katanagatari
This is the image base of the bangumi Katanagatari. We detected 22 characters and 2116 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; in practice they may contain noisy samples.** If you intend to train models on this dataset manually, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (each image has roughly a 1% probability of being noise); a download-and-inspect sketch follows the preview table below.
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 89 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 32 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 32 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 62 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 17 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 13 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 15 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 21 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 9 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 791 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 60 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 21 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 19 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 586 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 54 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 24 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 19 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 7 | [Download](17/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 18 | 18 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 8 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 64 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 155 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
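A minimal sketch of fetching and unpacking one per-character archive from the table above (the character index, output directory, and `huggingface_hub` usage are illustrative assumptions, not part of the original card):

```python
import zipfile

from huggingface_hub import hf_hub_download

# Download character #9's archive from the dataset repo, then unpack it
# locally so the images can be inspected and cleaned before training.
zip_path = hf_hub_download(
    repo_id="BangumiBase/katanagatari",
    filename="9/dataset.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall("katanagatari_char9")
```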
|
atom-in-the-universe/bild-3d2df0fe-a07c-40f3-be73-a81f664bf5f8 | 2023-10-03T16:33:32.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Undi95__ReMM-v2.2-L2-13B | 2023-10-03T16:46:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/ReMM-v2.2-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/ReMM-v2.2-L2-13B](https://huggingface.co/Undi95/ReMM-v2.2-L2-13B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-v2.2-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T16:45:21.105610](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-v2.2-L2-13B/blob/main/results_2023-10-03T16-45-21.105610.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.564002539739021,\n\
\ \"acc_stderr\": 0.034462415944422044,\n \"acc_norm\": 0.567760182755492,\n\
\ \"acc_norm_stderr\": 0.034440565206339785,\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.5135116682163505,\n\
\ \"mc2_stderr\": 0.015657648011440012\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.01440561827943618,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.014235872487909869\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6488747261501693,\n\
\ \"acc_stderr\": 0.004763465139038559,\n \"acc_norm\": 0.8415654252141008,\n\
\ \"acc_norm_stderr\": 0.003644017383711596\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448663,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448663\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"\
acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398675,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n\
\ \"acc_stderr\": 0.016650437588269073,\n \"acc_norm\": 0.45363128491620114,\n\
\ \"acc_norm_stderr\": 0.016650437588269073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516478,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516478\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144373,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144373\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n\
\ \"acc_stderr\": 0.012625879884891998,\n \"acc_norm\": 0.42503259452411996,\n\
\ \"acc_norm_stderr\": 0.012625879884891998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835546,\n \
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835546\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.032801882053486435,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.032801882053486435\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.5135116682163505,\n\
\ \"mc2_stderr\": 0.015657648011440012\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/ReMM-v2.2-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|arc:challenge|25_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hellaswag|10_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-45-21.105610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-45-21.105610.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T16-45-21.105610.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T16-45-21.105610.parquet'
- config_name: results
data_files:
- split: 2023_10_03T16_45_21.105610
path:
- results_2023-10-03T16-45-21.105610.parquet
- split: latest
path:
- results_2023-10-03T16-45-21.105610.parquet
---
# Dataset Card for Evaluation run of Undi95/ReMM-v2.2-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-v2.2-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-v2.2-L2-13B](https://huggingface.co/Undi95/ReMM-v2.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-v2.2-L2-13B",
"harness_truthfulqa_mc_0",
	split="latest")
```
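To fetch the aggregated metrics rather than the per-example details, the "results" configuration defined in this card's header can be loaded the same way. A minimal sketch, assuming the "latest" split name from the configs above:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run (one row per run)
results = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-v2.2-L2-13B",
	"results",
	split="latest")
```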
## Latest results
These are the [latest results from run 2023-10-03T16:45:21.105610](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-v2.2-L2-13B/blob/main/results_2023-10-03T16-45-21.105610.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.564002539739021,
"acc_stderr": 0.034462415944422044,
"acc_norm": 0.567760182755492,
"acc_norm_stderr": 0.034440565206339785,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.5135116682163505,
"mc2_stderr": 0.015657648011440012
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.01440561827943618,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.014235872487909869
},
"harness|hellaswag|10": {
"acc": 0.6488747261501693,
"acc_stderr": 0.004763465139038559,
"acc_norm": 0.8415654252141008,
"acc_norm_stderr": 0.003644017383711596
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448663,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448663
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398675,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45363128491620114,
"acc_stderr": 0.016650437588269073,
"acc_norm": 0.45363128491620114,
"acc_norm_stderr": 0.016650437588269073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829028,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829028
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516478,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516478
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144373,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144373
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42503259452411996,
"acc_stderr": 0.012625879884891998,
"acc_norm": 0.42503259452411996,
"acc_norm_stderr": 0.012625879884891998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.019955975145835546,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.019955975145835546
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.032801882053486435,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.032801882053486435
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.5135116682163505,
"mc2_stderr": 0.015657648011440012
}
}
```
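Each per-task configuration listed in this card's header can be loaded in the same way. For instance, a sketch for a single MMLU subject, using the `harness_hendrycksTest_<subject>_5` naming pattern defined above:
```python
from datasets import load_dataset

# Per-example details for one MMLU subject, latest run
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-v2.2-L2-13B",
	"harness_hendrycksTest_world_religions_5",
	split="latest")
```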
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__BigTranslate-13B-GPTQ | 2023-10-03T16:51:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/BigTranslate-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/BigTranslate-13B-GPTQ](https://huggingface.co/TheBloke/BigTranslate-13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__BigTranslate-13B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T16:49:43.978313](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__BigTranslate-13B-GPTQ/blob/main/results_2023-10-03T16-49-43.978313.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3183234260279837,\n\
\ \"acc_stderr\": 0.03345911148648646,\n \"acc_norm\": 0.32162105858822604,\n\
\ \"acc_norm_stderr\": 0.0334500159442395,\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.015274176219283352,\n \"mc2\": 0.40595324855240134,\n\
\ \"mc2_stderr\": 0.014913531416143539\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4308873720136519,\n \"acc_stderr\": 0.01447113339264247,\n\
\ \"acc_norm\": 0.45307167235494883,\n \"acc_norm_stderr\": 0.01454689205200563\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5786695877315275,\n\
\ \"acc_stderr\": 0.004927631806477558,\n \"acc_norm\": 0.7510456084445329,\n\
\ \"acc_norm_stderr\": 0.004315236154543959\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501117,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501117\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518753,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518753\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03333333333333329,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03333333333333329\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23544973544973544,\n\
\ \"acc_stderr\": 0.02185150982203172,\n \"acc_norm\": 0.23544973544973544,\n\
\ \"acc_norm_stderr\": 0.02185150982203172\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.039325376803928724,\n\
\ \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.039325376803928724\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764812,\n\
\ \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764812\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.1625615763546798,\n \"acc_stderr\": 0.025960300064605576,\n \"\
acc_norm\": 0.1625615763546798,\n \"acc_norm_stderr\": 0.025960300064605576\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3515151515151515,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.3515151515151515,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.43523316062176165,\n \"acc_stderr\": 0.03578038165008585,\n\
\ \"acc_norm\": 0.43523316062176165,\n \"acc_norm_stderr\": 0.03578038165008585\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3487179487179487,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.3487179487179487,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3559633027522936,\n \"acc_stderr\": 0.02052855927824422,\n \"\
acc_norm\": 0.3559633027522936,\n \"acc_norm_stderr\": 0.02052855927824422\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.03246887243637649,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.03246887243637649\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3480392156862745,\n \"acc_stderr\": 0.03343311240488419,\n \"\
acc_norm\": 0.3480392156862745,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.37130801687763715,\n \"acc_stderr\": 0.03145068600744859,\n \
\ \"acc_norm\": 0.37130801687763715,\n \"acc_norm_stderr\": 0.03145068600744859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462202,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462202\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.49586776859504134,\n \"acc_stderr\": 0.045641987674327526,\n \"\
acc_norm\": 0.49586776859504134,\n \"acc_norm_stderr\": 0.045641987674327526\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.04541609446503947,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.04541609446503947\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3803418803418803,\n\
\ \"acc_stderr\": 0.031804252043840985,\n \"acc_norm\": 0.3803418803418803,\n\
\ \"acc_norm_stderr\": 0.031804252043840985\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4342273307790549,\n\
\ \"acc_stderr\": 0.01772458938967779,\n \"acc_norm\": 0.4342273307790549,\n\
\ \"acc_norm_stderr\": 0.01772458938967779\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2245810055865922,\n\
\ \"acc_stderr\": 0.01395680366654464,\n \"acc_norm\": 0.2245810055865922,\n\
\ \"acc_norm_stderr\": 0.01395680366654464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.026004800363952113,\n\
\ \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.026004800363952113\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3762057877813505,\n\
\ \"acc_stderr\": 0.027513925683549427,\n \"acc_norm\": 0.3762057877813505,\n\
\ \"acc_norm_stderr\": 0.027513925683549427\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.025089478523765127,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.025089478523765127\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.02564555362226673,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.02564555362226673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27183833116036504,\n\
\ \"acc_stderr\": 0.011363135278651414,\n \"acc_norm\": 0.27183833116036504,\n\
\ \"acc_norm_stderr\": 0.011363135278651414\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714857,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714857\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.31209150326797386,\n \"acc_stderr\": 0.01874501120127766,\n \
\ \"acc_norm\": 0.31209150326797386,\n \"acc_norm_stderr\": 0.01874501120127766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.373134328358209,\n\
\ \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.373134328358209,\n\
\ \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.03550920185689629,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.03550920185689629\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.391812865497076,\n \"acc_stderr\": 0.037439798259264016,\n\
\ \"acc_norm\": 0.391812865497076,\n \"acc_norm_stderr\": 0.037439798259264016\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.015274176219283352,\n \"mc2\": 0.40595324855240134,\n\
\ \"mc2_stderr\": 0.014913531416143539\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/BigTranslate-13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|arc:challenge|25_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hellaswag|10_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-49-43.978313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T16-49-43.978313.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T16-49-43.978313.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T16-49-43.978313.parquet'
- config_name: results
data_files:
- split: 2023_10_03T16_49_43.978313
path:
- results_2023-10-03T16-49-43.978313.parquet
- split: latest
path:
- results_2023-10-03T16-49-43.978313.parquet
---
# Dataset Card for Evaluation run of TheBloke/BigTranslate-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/BigTranslate-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/BigTranslate-13B-GPTQ](https://huggingface.co/TheBloke/BigTranslate-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__BigTranslate-13B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
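The snippet above pulls the per-sample details for one task. To read the aggregated metrics instead, you can target the "results" configuration and the "latest" split; a minimal sketch, using the configurations listed in this card's metadata (any per-task configuration works the same way):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__BigTranslate-13B-GPTQ",
    "results",
    split="latest",
)
```

Since each run is also stored under a split named after its timestamp (here `2023_10_03T16_49_43.978313`), passing that split name instead of "latest" pins a specific run.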
## Latest results
These are the [latest results from run 2023-10-03T16:49:43.978313](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__BigTranslate-13B-GPTQ/blob/main/results_2023-10-03T16-49-43.978313.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3183234260279837,
"acc_stderr": 0.03345911148648646,
"acc_norm": 0.32162105858822604,
"acc_norm_stderr": 0.0334500159442395,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283352,
"mc2": 0.40595324855240134,
"mc2_stderr": 0.014913531416143539
},
"harness|arc:challenge|25": {
"acc": 0.4308873720136519,
"acc_stderr": 0.01447113339264247,
"acc_norm": 0.45307167235494883,
"acc_norm_stderr": 0.01454689205200563
},
"harness|hellaswag|10": {
"acc": 0.5786695877315275,
"acc_stderr": 0.004927631806477558,
"acc_norm": 0.7510456084445329,
"acc_norm_stderr": 0.004315236154543959
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501117,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501117
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518753,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518753
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.03333333333333329,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03333333333333329
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.02185150982203172,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.02185150982203172
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928724,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928724
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764812,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764812
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1625615763546798,
"acc_stderr": 0.025960300064605576,
"acc_norm": 0.1625615763546798,
"acc_norm_stderr": 0.025960300064605576
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3515151515151515,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.3515151515151515,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43523316062176165,
"acc_stderr": 0.03578038165008585,
"acc_norm": 0.43523316062176165,
"acc_norm_stderr": 0.03578038165008585
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3487179487179487,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.3487179487179487,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3559633027522936,
"acc_stderr": 0.02052855927824422,
"acc_norm": 0.3559633027522936,
"acc_norm_stderr": 0.02052855927824422
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.03246887243637649,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.03246887243637649
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3480392156862745,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.3480392156862745,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.37130801687763715,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.37130801687763715,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462202,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462202
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.49586776859504134,
"acc_stderr": 0.045641987674327526,
"acc_norm": 0.49586776859504134,
"acc_norm_stderr": 0.045641987674327526
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3803418803418803,
"acc_stderr": 0.031804252043840985,
"acc_norm": 0.3803418803418803,
"acc_norm_stderr": 0.031804252043840985
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4342273307790549,
"acc_stderr": 0.01772458938967779,
"acc_norm": 0.4342273307790549,
"acc_norm_stderr": 0.01772458938967779
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2245810055865922,
"acc_stderr": 0.01395680366654464,
"acc_norm": 0.2245810055865922,
"acc_norm_stderr": 0.01395680366654464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.026004800363952113,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.026004800363952113
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3762057877813505,
"acc_stderr": 0.027513925683549427,
"acc_norm": 0.3762057877813505,
"acc_norm_stderr": 0.027513925683549427
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.025089478523765127,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.025089478523765127
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.02564555362226673,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.02564555362226673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27183833116036504,
"acc_stderr": 0.011363135278651414,
"acc_norm": 0.27183833116036504,
"acc_norm_stderr": 0.011363135278651414
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714857,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714857
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.31209150326797386,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.31209150326797386,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.373134328358209,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.373134328358209,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.03550920185689629,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.03550920185689629
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.391812865497076,
"acc_stderr": 0.037439798259264016,
"acc_norm": 0.391812865497076,
"acc_norm_stderr": 0.037439798259264016
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283352,
"mc2": 0.40595324855240134,
"mc2_stderr": 0.014913531416143539
}
}
```
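As a rough cross-check, the headline "all" accuracy should track the mean of the per-task accuracies above (TruthfulQA contributes mc1/mc2 rather than acc). A minimal sketch of recomputing it, assuming the dictionary printed above has been saved locally as `results.json` (a hypothetical filename):

```python
import json

# Hypothetical local copy of the results dictionary shown above
with open("results.json") as f:
    results = json.load(f)

# Average the per-task accuracies, skipping the precomputed "all" entry and
# tasks (like truthfulqa:mc) that report mc1/mc2 instead of acc
accs = [v["acc"] for name, v in results.items() if name != "all" and "acc" in v]
print(sum(accs) / len(accs))  # should land close to results["all"]["acc"]
```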
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-e9b55032-078e-4601-8be5-1356dba9e781 | 2023-10-03T16:56:07.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-9e3becd3-5b41-4220-8397-ac9b836af2d7 | 2023-10-03T17:01:08.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged | 2023-10-03T17:05:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:03:59.314428](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged/blob/main/results_2023-10-03T17-03-59.314428.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5509228770970946,\n\
\ \"acc_stderr\": 0.034449384118189326,\n \"acc_norm\": 0.5553135730975924,\n\
\ \"acc_norm_stderr\": 0.03442896357458051,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.35847150784146525,\n\
\ \"mc2_stderr\": 0.013573219737109765\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5358361774744027,\n \"acc_stderr\": 0.014573813664735718,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216383\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6092411870145389,\n\
\ \"acc_stderr\": 0.004869232758103326,\n \"acc_norm\": 0.8196574387572196,\n\
\ \"acc_norm_stderr\": 0.003836867708701991\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724345,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724345\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850158,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850158\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7614678899082569,\n \"acc_stderr\": 0.018272575810231867,\n \"\
acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.018272575810231867\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864595,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864595\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7445721583652618,\n\
\ \"acc_stderr\": 0.015594955384455766,\n \"acc_norm\": 0.7445721583652618,\n\
\ \"acc_norm_stderr\": 0.015594955384455766\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.02599247202930639,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.02599247202930639\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.015166544550490314,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.015166544550490314\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.028275490156791462,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.028275490156791462\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940985,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940985\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41590612777053454,\n\
\ \"acc_stderr\": 0.012588323850313617,\n \"acc_norm\": 0.41590612777053454,\n\
\ \"acc_norm_stderr\": 0.012588323850313617\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596438,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596438\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590888,\n \
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590888\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.35847150784146525,\n\
\ \"mc2_stderr\": 0.013573219737109765\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-03-59.314428.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-03-59.314428.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-03-59.314428.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-03-59.314428.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_03_59.314428
path:
- results_2023-10-03T17-03-59.314428.parquet
- split: latest
path:
- results_2023-10-03T17-03-59.314428.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
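Since each evaluated task maps to its own configuration, the available configs (and the splits each one exposes) can be listed programmatically. A minimal sketch using the `datasets` inspection helpers (the repo id below is this dataset's id):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged"

# One config per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names(repo)
print(len(configs), sorted(configs)[:3])

# Each config exposes one timestamped split per run, plus "latest"
print(get_dataset_split_names(repo, "harness_arc_challenge_25"))
```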
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged",
"harness_truthfulqa_mc_0",
split="train")
```
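The per-task configurations follow the same pattern. For instance, to read a single MMLU subtask from the most recent run, request the "latest" split of one of the configs listed in the YAML header above (a minimal sketch; any other config name from that list works the same way):
```python
from datasets import load_dataset

# "latest" resolves to the most recent evaluation run;
# the timestamped splits pin a specific run instead.
details = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged",
    "harness_hendrycksTest_professional_law_5",
    split="latest",
)
```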
## Latest results
These are the [latest results from run 2023-10-03T17:03:59.314428](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged/blob/main/results_2023-10-03T17-03-59.314428.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5509228770970946,
"acc_stderr": 0.034449384118189326,
"acc_norm": 0.5553135730975924,
"acc_norm_stderr": 0.03442896357458051,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.35847150784146525,
"mc2_stderr": 0.013573219737109765
},
"harness|arc:challenge|25": {
"acc": 0.5358361774744027,
"acc_stderr": 0.014573813664735718,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216383
},
"harness|hellaswag|10": {
"acc": 0.6092411870145389,
"acc_stderr": 0.004869232758103326,
"acc_norm": 0.8196574387572196,
"acc_norm_stderr": 0.003836867708701991
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724345,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724345
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850158,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850158
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7614678899082569,
"acc_stderr": 0.018272575810231867,
"acc_norm": 0.7614678899082569,
"acc_norm_stderr": 0.018272575810231867
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864595,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864595
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.015594955384455766,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.015594955384455766
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930639,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930639
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490314,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490314
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.028275490156791462,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.028275490156791462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648043,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940985,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940985
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41590612777053454,
"acc_stderr": 0.012588323850313617,
"acc_norm": 0.41590612777053454,
"acc_norm_stderr": 0.012588323850313617
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596438,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596438
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.35847150784146525,
"mc2_stderr": 0.013573219737109765
}
}
```
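The aggregated numbers above are also stored in the "results" configuration; a minimal sketch for reading them back programmatically (the exact schema of the stored record is not documented here, so the final print is illustrative):
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for each run
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16_merged",
    "results",
    split="latest",
)
print(results[0])
```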
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged | 2023-10-03T17:11:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:10:34.313268](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged/blob/main/results_2023-10-03T17-10-34.313268.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5507697130408812,\n\
\ \"acc_stderr\": 0.03436338098734656,\n \"acc_norm\": 0.5551056698939095,\n\
\ \"acc_norm_stderr\": 0.03434258299541656,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4423342601614022,\n\
\ \"mc2_stderr\": 0.014147398296090968\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714698,\n\
\ \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345427\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6149173471420036,\n\
\ \"acc_stderr\": 0.004856203374715453,\n \"acc_norm\": 0.8212507468631747,\n\
\ \"acc_norm_stderr\": 0.0038235918141330326\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762602,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425173,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425173\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598028,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598028\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689047,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689047\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7445721583652618,\n\
\ \"acc_stderr\": 0.015594955384455766,\n \"acc_norm\": 0.7445721583652618,\n\
\ \"acc_norm_stderr\": 0.015594955384455766\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.0258622018522779,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.0258622018522779\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.0160837499868537,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.0160837499868537\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631462,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631462\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n\
\ \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n\
\ \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590888,\n \
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590888\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4423342601614022,\n\
\ \"mc2_stderr\": 0.014147398296090968\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-10-34.313268.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-10-34.313268.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-10-34.313268.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-10-34.313268.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_10_34.313268
path:
- results_2023-10-03T17-10-34.313268.parquet
- split: latest
path:
- results_2023-10-03T17-10-34.313268.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged",
"harness_truthfulqa_mc_0",
split="train")
```
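Each task is also exposed both as a timestamped split and as a `latest` split, following the config names declared in the YAML header above. As a minimal sketch, the same pattern loads the per-example details for any single subtask, for instance the abstract-algebra MMLU subset:
```python
from datasets import load_dataset

# Config names follow the pattern from the YAML header above.
# "latest" always resolves to the most recent run, while the
# timestamped split pins a specific evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details[0])  # inspect one evaluated example
```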
## Latest results
These are the [latest results from run 2023-10-03T17:10:34.313268](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged/blob/main/results_2023-10-03T17-10-34.313268.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5507697130408812,
"acc_stderr": 0.03436338098734656,
"acc_norm": 0.5551056698939095,
"acc_norm_stderr": 0.03434258299541656,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4423342601614022,
"mc2_stderr": 0.014147398296090968
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714698,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345427
},
"harness|hellaswag|10": {
"acc": 0.6149173471420036,
"acc_stderr": 0.004856203374715453,
"acc_norm": 0.8212507468631747,
"acc_norm_stderr": 0.0038235918141330326
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762602,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425173,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425173
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689047,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689047
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.015594955384455766,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.015594955384455766
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.0258622018522779,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.0258622018522779
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.0160837499868537,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.0160837499868537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631462,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41460234680573665,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.41460234680573665,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4423342601614022,
"mc2_stderr": 0.014147398296090968
}
}
```
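The aggregated numbers above are also stored in the `results` configuration declared in the YAML header, where `latest` resolves to `results_2023-10-03T17-10-34.313268.parquet`. A minimal sketch for loading them (the exact schema of the results file is an assumption here, not something this card guarantees):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics shown above; split="latest"
# points at the most recent results parquet per the configs section.
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16_merged",
    "results",
    split="latest",
)
# Assumption: the aggregated file loads as a row-indexable table.
print(results[0])
```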
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-a24a9e03-6bfe-44ee-a465-5a4ee0be1332 | 2023-10-03T17:12:05.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_breadlicker45__dough-base-001 | 2023-10-03T17:13:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of breadlicker45/dough-base-001
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [breadlicker45/dough-base-001](https://huggingface.co/breadlicker45/dough-base-001)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_breadlicker45__dough-base-001\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:12:28.280269](https://huggingface.co/datasets/open-llm-leaderboard/details_breadlicker45__dough-base-001/blob/main/results_2023-10-03T17-12-28.280269.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23101378436942155,\n\
\ \"acc_stderr\": 0.03070203776075275,\n \"acc_norm\": 0.23167967796168026,\n\
\ \"acc_norm_stderr\": 0.030718436434856174,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253586,\n \"mc2\": 0.5340310687107248,\n\
\ \"mc2_stderr\": 0.016085980419349055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1885665529010239,\n \"acc_stderr\": 0.011430897647675818,\n\
\ \"acc_norm\": 0.23890784982935154,\n \"acc_norm_stderr\": 0.012461071376316623\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2586138219478192,\n\
\ \"acc_stderr\": 0.00436978052982401,\n \"acc_norm\": 0.24756024696275641,\n\
\ \"acc_norm_stderr\": 0.0043071285732852365\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n\
\ \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.26560587515299877,\n \"mc1_stderr\": 0.015461027627253586,\n\
\ \"mc2\": 0.5340310687107248,\n \"mc2_stderr\": 0.016085980419349055\n\
\ }\n}\n```"
repo_url: https://huggingface.co/breadlicker45/dough-base-001
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-12-28.280269.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- results_2023-10-03T17-12-28.280269.parquet
- split: latest
path:
- results_2023-10-03T17-12-28.280269.parquet
---
# Dataset Card for Evaluation run of breadlicker45/dough-base-001
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/breadlicker45/dough-base-001
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [breadlicker45/dough-base-001](https://huggingface.co/breadlicker45/dough-base-001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_breadlicker45__dough-base-001",
"harness_truthfulqa_mc_0",
split="train")
```
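The aggregated metrics live in the `results` configuration, and every configuration also exposes a `latest` split pointing at the most recent run (both names appear in the configs listed above). A minimal sketch:
```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split resolves to the most
# recent run's parquet file (here 2023-10-03T17-12-28.280269).
results = load_dataset("open-llm-leaderboard/details_breadlicker45__dough-base-001",
                       "results",
                       split="latest")
```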
## Latest results
These are the [latest results from run 2023-10-03T17:12:28.280269](https://huggingface.co/datasets/open-llm-leaderboard/details_breadlicker45__dough-base-001/blob/main/results_2023-10-03T17-12-28.280269.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23101378436942155,
"acc_stderr": 0.03070203776075275,
"acc_norm": 0.23167967796168026,
"acc_norm_stderr": 0.030718436434856174,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253586,
"mc2": 0.5340310687107248,
"mc2_stderr": 0.016085980419349055
},
"harness|arc:challenge|25": {
"acc": 0.1885665529010239,
"acc_stderr": 0.011430897647675818,
"acc_norm": 0.23890784982935154,
"acc_norm_stderr": 0.012461071376316623
},
"harness|hellaswag|10": {
"acc": 0.2586138219478192,
"acc_stderr": 0.00436978052982401,
"acc_norm": 0.24756024696275641,
"acc_norm_stderr": 0.0043071285732852365
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253586,
"mc2": 0.5340310687107248,
"mc2_stderr": 0.016085980419349055
}
}
```
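As a quick sanity check, the per-task dictionary above can be re-aggregated directly; a short sketch (assuming the JSON above has been saved locally as `results.json`, a hypothetical filename):
```python
import json

# "results.json" is a hypothetical local copy of the dictionary shown above.
with open("results.json") as f:
    results = json.load(f)

# Average acc_norm over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```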
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged | 2023-10-03T17:18:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:16:44.707859](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged/blob/main/results_2023-10-03T17-16-44.707859.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5509206673641549,\n\
\ \"acc_stderr\": 0.034392600210851124,\n \"acc_norm\": 0.5552513303180491,\n\
\ \"acc_norm_stderr\": 0.034372095380618445,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.402620834492213,\n\
\ \"mc2_stderr\": 0.013786763339278897\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714698,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6116311491734714,\n\
\ \"acc_stderr\": 0.00486383136484807,\n \"acc_norm\": 0.8193586934873531,\n\
\ \"acc_norm_stderr\": 0.003839344497191943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.04177578950739993,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.04177578950739993\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.02534800603153477,\n \
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.02534800603153477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182087,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182087\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083291,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083291\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842544,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842544\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665224,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665224\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n\
\ \"acc_stderr\": 0.015569254692045759,\n \"acc_norm\": 0.7458492975734355,\n\
\ \"acc_norm_stderr\": 0.015569254692045759\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584187,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584187\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n\
\ \"acc_stderr\": 0.015925564060208154,\n \"acc_norm\": 0.3474860335195531,\n\
\ \"acc_norm_stderr\": 0.015925564060208154\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n\
\ \"acc_stderr\": 0.01256788267380368,\n \"acc_norm\": 0.41134289439374183,\n\
\ \"acc_norm_stderr\": 0.01256788267380368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590888,\n \
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590888\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.402620834492213,\n\
\ \"mc2_stderr\": 0.013786763339278897\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-16-44.707859.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- results_2023-10-03T17-16-44.707859.parquet
- split: latest
path:
- results_2023-10-03T17-16-44.707859.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged",
"harness_truthfulqa_mc_0",
split="train")
```
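Each configuration also exposes a `latest` split that is an alias for the most recent run, and the aggregated scores live in the `results` configuration (see the YAML metadata above). A minimal sketch of loading them, assuming only that the `datasets` library is installed:
```python
from datasets import load_dataset

# Load the aggregated scores for this evaluation run; the "latest" split
# always points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged",
    "results",
    split="latest",
)
```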
## Latest results
These are the [latest results from run 2023-10-03T17:16:44.707859](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged/blob/main/results_2023-10-03T17-16-44.707859.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5509206673641549,
"acc_stderr": 0.034392600210851124,
"acc_norm": 0.5552513303180491,
"acc_norm_stderr": 0.034372095380618445,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.402620834492213,
"mc2_stderr": 0.013786763339278897
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714698,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642664
},
"harness|hellaswag|10": {
"acc": 0.6116311491734714,
"acc_stderr": 0.00486383136484807,
"acc_norm": 0.8193586934873531,
"acc_norm_stderr": 0.003839344497191943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.04177578950739993,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.04177578950739993
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.02534800603153477,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.02534800603153477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182087,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182087
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842544,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842544
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665224,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665224
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7458492975734355,
"acc_stderr": 0.015569254692045759,
"acc_norm": 0.7458492975734355,
"acc_norm_stderr": 0.015569254692045759
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584187,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584187
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.015925564060208154,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.015925564060208154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41134289439374183,
"acc_stderr": 0.01256788267380368,
"acc_norm": 0.41134289439374183,
"acc_norm_stderr": 0.01256788267380368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.402620834492213,
"mc2_stderr": 0.013786763339278897
}
}
```
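The `hendrycksTest` entries above are the 57 MMLU subjects; a minimal sketch (not the official aggregation code) of averaging their `acc_norm` scores by hand, where `results` is a hypothetical stand-in for the full parsed dict and only two entries are shown for brevity:
```python
# `results` mirrors the shape of the JSON above; fill in the remaining
# subjects from the full file to reproduce the reported averages.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.45925925925925926},
    # ... remaining hendrycksTest tasks ...
}

mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mean_acc_norm = sum(results[k]["acc_norm"] for k in mmlu_tasks) / len(mmlu_tasks)
print(f"mean acc_norm over {len(mmlu_tasks)} MMLU subjects: {mean_acc_norm:.4f}")
```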
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG | 2023-10-03T17:20:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LTC-AI-Labs/L2-7b-Base-test-WVG
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LTC-AI-Labs/L2-7b-Base-test-WVG](https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:19:09.186622](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG/blob/main/results_2023-10-03T17-19-09.186622.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5122076677774635,\n\
\ \"acc_stderr\": 0.034888652826956947,\n \"acc_norm\": 0.5157490760401192,\n\
\ \"acc_norm_stderr\": 0.0348750803996461,\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.46277766867323933,\n\
\ \"mc2_stderr\": 0.015164855129314675\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127104,\n\
\ \"acc_norm\": 0.5426621160409556,\n \"acc_norm_stderr\": 0.014558106543924063\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5905198167695678,\n\
\ \"acc_stderr\": 0.004907329270272705,\n \"acc_norm\": 0.7781318462457678,\n\
\ \"acc_norm_stderr\": 0.004146537488135695\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983052,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5645161290322581,\n\
\ \"acc_stderr\": 0.028206225591502744,\n \"acc_norm\": 0.5645161290322581,\n\
\ \"acc_norm_stderr\": 0.028206225591502744\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509567,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6515151515151515,\n \"acc_stderr\": 0.03394853965156402,\n \"\
acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.03394853965156402\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916647,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916647\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017838,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017838\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073855,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073855\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7064220183486238,\n \"acc_stderr\": 0.019525151122639667,\n \"\
acc_norm\": 0.7064220183486238,\n \"acc_norm_stderr\": 0.019525151122639667\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395592,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978814,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978814\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.03847021420456023,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.03847021420456023\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196694,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196694\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7151979565772669,\n\
\ \"acc_stderr\": 0.016139174096522574,\n \"acc_norm\": 0.7151979565772669,\n\
\ \"acc_norm_stderr\": 0.016139174096522574\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.026589231142174267,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.026589231142174267\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220501,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220501\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.0274666102131401,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.0274666102131401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.02758600622160771,\n\
\ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.02758600622160771\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3754889178617992,\n\
\ \"acc_stderr\": 0.01236794539672821,\n \"acc_norm\": 0.3754889178617992,\n\
\ \"acc_norm_stderr\": 0.01236794539672821\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468317,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02021703065318646,\n \
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02021703065318646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.03155782816556164,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.03155782816556164\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.46277766867323933,\n\
\ \"mc2_stderr\": 0.015164855129314675\n }\n}\n```"
repo_url: https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-19-09.186622.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- results_2023-10-03T17-19-09.186622.parquet
- split: latest
path:
- results_2023-10-03T17-19-09.186622.parquet
---
# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Base-test-WVG
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LTC-AI-Labs/L2-7b-Base-test-WVG](https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG",
"harness_truthfulqa_mc_0",
split="train")
```
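You can also load the aggregated metrics directly. A minimal sketch, assuming the `results` configuration and `latest` split defined in this repository's configs:
```python
from datasets import load_dataset

# The "results" configuration holds one row of aggregated metrics per run;
# per the configs above, the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the run
```
To inspect a specific run instead, pass its timestamped split name (e.g. `split="2023_10_03T17_19_09.186622"`), as listed in the configs above.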
## Latest results
These are the [latest results from run 2023-10-03T17:19:09.186622](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG/blob/main/results_2023-10-03T17-19-09.186622.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5122076677774635,
"acc_stderr": 0.034888652826956947,
"acc_norm": 0.5157490760401192,
"acc_norm_stderr": 0.0348750803996461,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.46277766867323933,
"mc2_stderr": 0.015164855129314675
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127104,
"acc_norm": 0.5426621160409556,
"acc_norm_stderr": 0.014558106543924063
},
"harness|hellaswag|10": {
"acc": 0.5905198167695678,
"acc_stderr": 0.004907329270272705,
"acc_norm": 0.7781318462457678,
"acc_norm_stderr": 0.004146537488135695
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983052,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.028206225591502744,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.028206225591502744
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.03394853965156402,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.03394853965156402
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916647,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916647
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017838,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073855,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073855
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7064220183486238,
"acc_stderr": 0.019525151122639667,
"acc_norm": 0.7064220183486238,
"acc_norm_stderr": 0.019525151122639667
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978814,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978814
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.03847021420456023,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.03847021420456023
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196694,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196694
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7151979565772669,
"acc_stderr": 0.016139174096522574,
"acc_norm": 0.7151979565772669,
"acc_norm_stderr": 0.016139174096522574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.026589231142174267,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.026589231142174267
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220501,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220501
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.0274666102131401,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.0274666102131401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.02758600622160771,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.02758600622160771
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3754889178617992,
"acc_stderr": 0.01236794539672821,
"acc_norm": 0.3754889178617992,
"acc_norm_stderr": 0.01236794539672821
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02021703065318646,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02021703065318646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.03155782816556164,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.03155782816556164
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.03528211258245229,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.03528211258245229
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.46277766867323933,
"mc2_stderr": 0.015164855129314675
}
}
```
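The per-task entries above share the `harness|` prefix, with all 57 MMLU subtasks under `harness|hendrycksTest-`. A minimal sketch of aggregating them, assuming you have downloaded the results JSON linked above (the hosted file may wrap the dictionary shown here in extra metadata, which the sketch unwraps defensively):
```python
import json

# Hypothetical local copy of the results file linked in this section.
with open("results_2023-10-03T17-19-09.186622.json") as f:
    data = json.load(f)

# Unwrap a possible {"results": {...}} envelope; fall back to the raw dict.
results = data.get("results", data)

# Average the accuracy over all MMLU (hendrycksTest) subtasks.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```
For the run above this mean lands near 0.51, close to the overall `acc` reported under `all` (which also averages in ARC and HellaSwag).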
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B | 2023-10-03T17:31:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ajibawa-2023/Uncensored-Frank-33B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ajibawa-2023/Uncensored-Frank-33B](https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:30:05.303429](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B/blob/main/results_2023-10-03T17-30-05.303429.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5771983984504457,\n\
\ \"acc_stderr\": 0.03408414099021468,\n \"acc_norm\": 0.58085507719039,\n\
\ \"acc_norm_stderr\": 0.0340637124300013,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5403351544458733,\n\
\ \"mc2_stderr\": 0.01610728856944345\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279547,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6360286795459071,\n\
\ \"acc_stderr\": 0.004801572028920798,\n \"acc_norm\": 0.8330013941445927,\n\
\ \"acc_norm_stderr\": 0.0037221237096104623\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236153,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236153\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796634,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796634\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.01538284558758451,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.01538284558758451\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\
\ \"acc_stderr\": 0.0127169417207348,\n \"acc_norm\": 0.45436766623207303,\n\
\ \"acc_norm_stderr\": 0.0127169417207348\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235404,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.0294752502360172,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.0294752502360172\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5403351544458733,\n\
\ \"mc2_stderr\": 0.01610728856944345\n }\n}\n```"
repo_url: https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-30-05.303429.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- results_2023-10-03T17-30-05.303429.parquet
- split: latest
path:
- results_2023-10-03T17-30-05.303429.parquet
---
# Dataset Card for Evaluation run of ajibawa-2023/Uncensored-Frank-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ajibawa-2023/Uncensored-Frank-33B](https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Load the per-example TruthfulQA (0-shot) details for this run;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B",
                    "harness_truthfulqa_mc_0",
                    split="train")
```
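The same pattern works for any configuration listed in the metadata above. As a minimal sketch, the aggregated scores can be loaded through the "results" configuration and its "latest" split (both declared in the configs section of this card):
```python
from datasets import load_dataset

# A minimal sketch: load the aggregated metrics of the most recent run.
# The "results" configuration and the "latest" split are both declared
# in the configs section of this card's metadata.
results = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B",
                       "results",
                       split="latest")

print(results[0])  # one flattened row of aggregated metrics
```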
## Latest results
These are the [latest results from run 2023-10-03T17:30:05.303429](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B/blob/main/results_2023-10-03T17-30-05.303429.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5771983984504457,
"acc_stderr": 0.03408414099021468,
"acc_norm": 0.58085507719039,
"acc_norm_stderr": 0.0340637124300013,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5403351544458733,
"mc2_stderr": 0.01610728856944345
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279547,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6360286795459071,
"acc_stderr": 0.004801572028920798,
"acc_norm": 0.8330013941445927,
"acc_norm_stderr": 0.0037221237096104623
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504513,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504513
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091707,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796634,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.01538284558758451,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.01538284558758451
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.0127169417207348,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.0127169417207348
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235404,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.0294752502360172,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.0294752502360172
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5403351544458733,
"mc2_stderr": 0.01610728856944345
}
}
```
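If you prefer the raw JSON shown above to the parquet-backed `datasets` view, one option, sketched here with `huggingface_hub` (assuming it is installed), is to download the results file linked at the start of this section:
```python
import json

from huggingface_hub import hf_hub_download

# A sketch: download the aggregated-results JSON for this run from the
# dataset repository, then read it with the standard library.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B",
    filename="results_2023-10-03T17-30-05.303429.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # 0.5771983984504457 for this run
```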
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__Emerald-13B | 2023-10-03T17:32:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/Emerald-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Emerald-13B](https://huggingface.co/Undi95/Emerald-13B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Emerald-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:31:23.265550](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Emerald-13B/blob/main/results_2023-10-03T17-31-23.265550.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5588193039950157,\n\
\ \"acc_stderr\": 0.03445212545677957,\n \"acc_norm\": 0.5628177704578535,\n\
\ \"acc_norm_stderr\": 0.0344293961660333,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5094365067991387,\n\
\ \"mc2_stderr\": 0.015354293715350336\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409172,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192601\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6385182234614618,\n\
\ \"acc_stderr\": 0.0047944784263826085,\n \"acc_norm\": 0.836885082652858,\n\
\ \"acc_norm_stderr\": 0.0036871539405687963\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342592,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342592\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.027430866579973463,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.027430866579973463\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178274,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178274\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n\
\ \"acc_stderr\": 0.01886188502153473,\n \"acc_norm\": 0.7376146788990826,\n\
\ \"acc_norm_stderr\": 0.01886188502153473\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n\
\ \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.027046857630716677,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.027046857630716677\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.015384352284543944,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.015384352284543944\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.02590663263101613,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.02590663263101613\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n\
\ \"acc_stderr\": 0.016463200238114525,\n \"acc_norm\": 0.4122905027932961,\n\
\ \"acc_norm_stderr\": 0.016463200238114525\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363933,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363933\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.02726429759980401,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.02726429759980401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.02686949074481525,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.02686949074481525\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940985,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940985\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n\
\ \"acc_stderr\": 0.01263088477159969,\n \"acc_norm\": 0.42633637548891784,\n\
\ \"acc_norm_stderr\": 0.01263088477159969\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5702614379084967,\n \"acc_stderr\": 0.02002712278492855,\n \
\ \"acc_norm\": 0.5702614379084967,\n \"acc_norm_stderr\": 0.02002712278492855\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5094365067991387,\n\
\ \"mc2_stderr\": 0.015354293715350336\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Emerald-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-31-23.265550.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- results_2023-10-03T17-31-23.265550.parquet
- split: latest
path:
- results_2023-10-03T17-31-23.265550.parquet
---
# Dataset Card for Evaluation run of Undi95/Emerald-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Emerald-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Emerald-13B](https://huggingface.co/Undi95/Emerald-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Emerald-13B",
"harness_truthfulqa_mc_0",
split="train")
```
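The same `load_dataset` call also serves the aggregated metrics: the "results" configuration declared in this card holds them, and its "latest" split always points to the most recent run. A minimal variation of the snippet above (nothing assumed beyond the configurations and splits listed in this card):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run,
# via the "results" config and its "latest" split.
results = load_dataset("open-llm-leaderboard/details_Undi95__Emerald-13B",
	"results",
	split="latest")
```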
## Latest results
These are the [latest results from run 2023-10-03T17:31:23.265550](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Emerald-13B/blob/main/results_2023-10-03T17-31-23.265550.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5588193039950157,
"acc_stderr": 0.03445212545677957,
"acc_norm": 0.5628177704578535,
"acc_norm_stderr": 0.0344293961660333,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.5094365067991387,
"mc2_stderr": 0.015354293715350336
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409172,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192601
},
"harness|hellaswag|10": {
"acc": 0.6385182234614618,
"acc_stderr": 0.0047944784263826085,
"acc_norm": 0.836885082652858,
"acc_norm_stderr": 0.0036871539405687963
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342592,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342592
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.027430866579973463,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.027430866579973463
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178274,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178274
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716677,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716677
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.015384352284543944,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.015384352284543944
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.02590663263101613,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.02590663263101613
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.016463200238114525,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.016463200238114525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363933,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363933
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.02726429759980401,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.02726429759980401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.02686949074481525,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.02686949074481525
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940985,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940985
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.01263088477159969,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.01263088477159969
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5702614379084967,
"acc_stderr": 0.02002712278492855,
"acc_norm": 0.5702614379084967,
"acc_norm_stderr": 0.02002712278492855
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.5094365067991387,
"mc2_stderr": 0.015354293715350336
}
}
```
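The top-level "all" block above averages the per-task metrics. As a sketch of working with the raw results file directly (the filename is the one linked above; the flat key layout is an assumption based on the dict printed in this card):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file linked above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Undi95__Emerald-13B",
    filename="results_2023-10-03T17-31-23.265550.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)  # assumed to match the dict shown above

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average acc: {sum(mmlu) / len(mmlu):.4f}")
```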
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__MXLewd-L2-20B | 2023-10-03T17:33:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/MXLewd-L2-20B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MXLewd-L2-20B](https://huggingface.co/Undi95/MXLewd-L2-20B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MXLewd-L2-20B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:32:13.142085](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MXLewd-L2-20B/blob/main/results_2023-10-03T17-32-13.142085.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5756106760468652,\n\
\ \"acc_stderr\": 0.034247535703005975,\n \"acc_norm\": 0.5793670546336733,\n\
\ \"acc_norm_stderr\": 0.03422343071424788,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5164709446147603,\n\
\ \"mc2_stderr\": 0.015892065045890465\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097863,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168477\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6700856403106951,\n\
\ \"acc_stderr\": 0.004692208279690597,\n \"acc_norm\": 0.8533160724955188,\n\
\ \"acc_norm_stderr\": 0.0035306750148923053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.038047497443647646,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.038047497443647646\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342647,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342647\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\"\
: 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n\
\ \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n\
\ \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713546,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713546\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177495,\n\
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552379,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.744954128440367,\n \"acc_stderr\": 0.01868850085653584,\n \"acc_norm\"\
: 0.744954128440367,\n \"acc_norm_stderr\": 0.01868850085653584\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145617,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.02390232554956041,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.02390232554956041\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946007,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946007\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316554,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n\
\ \"acc_stderr\": 0.016449708209026078,\n \"acc_norm\": 0.4100558659217877,\n\
\ \"acc_norm_stderr\": 0.016449708209026078\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621355,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621355\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0296582350976669,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0296582350976669\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547235,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.02976826352893311,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.02976826352893311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271765,\n \
\ \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5164709446147603,\n\
\ \"mc2_stderr\": 0.015892065045890465\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/MXLewd-L2-20B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-32-13.142085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-32-13.142085.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-32-13.142085.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-32-13.142085.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_32_13.142085
path:
- results_2023-10-03T17-32-13.142085.parquet
- split: latest
path:
- results_2023-10-03T17-32-13.142085.parquet
---
# Dataset Card for Evaluation run of Undi95/MXLewd-L2-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MXLewd-L2-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MXLewd-L2-20B](https://huggingface.co/Undi95/MXLewd-L2-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MXLewd-L2-20B",
"harness_truthfulqa_mc_0",
split="train")
```
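The same call works for any other configuration listed in the YAML header above. As a minimal sketch (assuming the `datasets` library is installed and the repo is reachable), you could pull the aggregated scores via the "results" config, or the per-task details of a single eval, using the "latest" split that every config exposes:
```python
from datasets import load_dataset

# Aggregated scores of the run (the "results" config described above).
results = load_dataset("open-llm-leaderboard/details_Undi95__MXLewd-L2-20B",
                       "results",
                       split="latest")

# Per-task details, e.g. the 25-shot ARC challenge config.
arc_details = load_dataset("open-llm-leaderboard/details_Undi95__MXLewd-L2-20B",
                           "harness_arc_challenge_25",
                           split="latest")
```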
## Latest results
These are the [latest results from run 2023-10-03T17:32:13.142085](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MXLewd-L2-20B/blob/main/results_2023-10-03T17-32-13.142085.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5756106760468652,
"acc_stderr": 0.034247535703005975,
"acc_norm": 0.5793670546336733,
"acc_norm_stderr": 0.03422343071424788,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5164709446147603,
"mc2_stderr": 0.015892065045890465
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097863,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168477
},
"harness|hellaswag|10": {
"acc": 0.6700856403106951,
"acc_stderr": 0.004692208279690597,
"acc_norm": 0.8533160724955188,
"acc_norm_stderr": 0.0035306750148923053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.038047497443647646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.038047497443647646
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342647,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342647
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713546,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713546
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.025141801511177495,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.025141801511177495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552379,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.744954128440367,
"acc_stderr": 0.01868850085653584,
"acc_norm": 0.744954128440367,
"acc_norm_stderr": 0.01868850085653584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145617,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956041,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956041
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946007,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946007
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316554,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.016449708209026078,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.016449708209026078
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621355,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0296582350976669,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0296582350976669
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547235,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.02976826352893311,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.02976826352893311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271765,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5164709446147603,
"mc2_stderr": 0.015892065045890465
}
}
```
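Every per-task entry in the blob above follows the `harness|<task>|<n_shots>` key pattern, which makes the results easy to post-process. A minimal sketch (assuming the JSON above has been saved locally under the hypothetical name `results.json`):
```python
import json
from statistics import mean

# Assumes the results JSON above was saved locally as "results.json".
with open("results.json") as f:
    results = json.load(f)

# Per-task entries follow the "harness|<task>|<n_shots>" key pattern;
# here we average the accuracies of the hendrycksTest (MMLU) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {mean(mmlu_accs):.4f}")
```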
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__Amethyst-13B | 2023-10-03T17:39:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/Amethyst-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Amethyst-13B](https://huggingface.co/Undi95/Amethyst-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Amethyst-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:37:36.187420](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Amethyst-13B/blob/main/results_2023-10-03T17-37-36.187420.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5610110706771517,\n\
\ \"acc_stderr\": 0.0343909052667031,\n \"acc_norm\": 0.5648605475782799,\n\
\ \"acc_norm_stderr\": 0.03436914805487177,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5242719773292807,\n\
\ \"mc2_stderr\": 0.015543122220738859\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449708,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.01413770860175909\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6387173869747063,\n\
\ \"acc_stderr\": 0.004793904922401889,\n \"acc_norm\": 0.8317068313085043,\n\
\ \"acc_norm_stderr\": 0.003733618111043529\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.02704574657353433,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.02704574657353433\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510193,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510193\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960415,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960415\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n\
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.015384352284543941,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.015384352284543941\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.02726429759980401,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.02726429759980401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.02677492989972233,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.02677492989972233\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\
\ \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n\
\ \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.01996681117825648,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.01996681117825648\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5242719773292807,\n\
\ \"mc2_stderr\": 0.015543122220738859\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Amethyst-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-37-36.187420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-37-36.187420.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-37-36.187420.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-37-36.187420.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_37_36.187420
path:
- results_2023-10-03T17-37-36.187420.parquet
- split: latest
path:
- results_2023-10-03T17-37-36.187420.parquet
---
# Dataset Card for Evaluation run of Undi95/Amethyst-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Amethyst-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Amethyst-13B](https://huggingface.co/Undi95/Amethyst-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Amethyst-13B",
"harness_truthfulqa_mc_0",
split="train")
```
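As a complement, here is a minimal sketch (using the same `datasets` API) of how you can target the "latest" split of a task configuration, or the aggregated "results" configuration described above:
```python
from datasets import load_dataset

# The "latest" split of a task configuration always tracks the most recent run.
details = load_dataset("open-llm-leaderboard/details_Undi95__Amethyst-13B",
                       "harness_truthfulqa_mc_0",
                       split="latest")

# The "results" configuration holds the aggregated metrics for the run.
results = load_dataset("open-llm-leaderboard/details_Undi95__Amethyst-13B",
                       "results",
                       split="latest")
```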
## Latest results
These are the [latest results from run 2023-10-03T17:37:36.187420](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Amethyst-13B/blob/main/results_2023-10-03T17-37-36.187420.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5610110706771517,
"acc_stderr": 0.0343909052667031,
"acc_norm": 0.5648605475782799,
"acc_norm_stderr": 0.03436914805487177,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5242719773292807,
"mc2_stderr": 0.015543122220738859
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449708,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.01413770860175909
},
"harness|hellaswag|10": {
"acc": 0.6387173869747063,
"acc_stderr": 0.004793904922401889,
"acc_norm": 0.8317068313085043,
"acc_norm_stderr": 0.003733618111043529
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510193,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510193
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960415,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960415
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.015384352284543941,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.015384352284543941
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.02726429759980401,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.02726429759980401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.02677492989972233,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.02677492989972233
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.01996681117825648,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.01996681117825648
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5242719773292807,
"mc2_stderr": 0.015543122220738859
}
}
```
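If you prefer the raw JSON file linked above to the parquet-backed configurations, a sketch along the following lines can fetch it directly from the Hub. This is only a sketch: it assumes the metrics dict printed above may sit under a top-level `"results"` key in the file, which is not confirmed by this card.
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Undi95__Amethyst-13B",
    filename="results_2023-10-03T17-37-36.187420.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumption: the metrics may be nested under a "results" key; fall back to the
# top level if they are not.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```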
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__MM-ReMM-L2-20B | 2023-10-03T17:40:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/MM-ReMM-L2-20B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MM-ReMM-L2-20B](https://huggingface.co/Undi95/MM-ReMM-L2-20B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MM-ReMM-L2-20B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:39:23.702108](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MM-ReMM-L2-20B/blob/main/results_2023-10-03T17-39-23.702108.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5665465459002151,\n\
\ \"acc_stderr\": 0.034322884462850756,\n \"acc_norm\": 0.5701528202686406,\n\
\ \"acc_norm_stderr\": 0.034300794934459776,\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.016909693580248825,\n \"mc2\": 0.5333434257017081,\n\
\ \"mc2_stderr\": 0.015907207649223338\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938215\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6595299741087433,\n\
\ \"acc_stderr\": 0.004728988167338544,\n \"acc_norm\": 0.851822346146186,\n\
\ \"acc_norm_stderr\": 0.0035454991695580518\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.03028500925900979,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.03028500925900979\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.02698528957655274,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.02698528957655274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117478,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117478\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"\
acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209814,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209814\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686933,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686933\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.016214148752136632,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.016214148752136632\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02736359328468496,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02736359328468496\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.016909693580248825,\n \"mc2\": 0.5333434257017081,\n\
\ \"mc2_stderr\": 0.015907207649223338\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/MM-ReMM-L2-20B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-39-23.702108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-39-23.702108.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-39-23.702108.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-39-23.702108.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_39_23.702108
path:
- results_2023-10-03T17-39-23.702108.parquet
- split: latest
path:
- results_2023-10-03T17-39-23.702108.parquet
---
# Dataset Card for Evaluation run of Undi95/MM-ReMM-L2-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MM-ReMM-L2-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MM-ReMM-L2-20B](https://huggingface.co/Undi95/MM-ReMM-L2-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MM-ReMM-L2-20B",
"harness_truthfulqa_mc_0",
split="train")
```
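The same call pattern works for any configuration listed in this card. As a minimal sketch (assuming only that the `datasets` library is installed), the aggregated "results" configuration can be loaded at its "latest" split:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__MM-ReMM-L2-20B",
    "results",
    split="latest",
)
print(results)
```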
## Latest results
These are the [latest results from run 2023-10-03T17:39:23.702108](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MM-ReMM-L2-20B/blob/main/results_2023-10-03T17-39-23.702108.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5665465459002151,
"acc_stderr": 0.034322884462850756,
"acc_norm": 0.5701528202686406,
"acc_norm_stderr": 0.034300794934459776,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248825,
"mc2": 0.5333434257017081,
"mc2_stderr": 0.015907207649223338
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938215
},
"harness|hellaswag|10": {
"acc": 0.6595299741087433,
"acc_stderr": 0.004728988167338544,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.0035454991695580518
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.03028500925900979,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.03028500925900979
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899208,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117478,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117478
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374983,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374983
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209814,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209814
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686933,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686933
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.016214148752136632,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.016214148752136632
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02736359328468496,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02736359328468496
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248825,
"mc2": 0.5333434257017081,
"mc2_stderr": 0.015907207649223338
}
}
```
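For programmatic access to the raw results shown above, one option is to download the JSON file directly from the dataset repository. This is a minimal sketch, assuming the `huggingface_hub` library is installed and that the file's top-level layout matches the snippet above; the filename is the one linked in the results URL:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file from the dataset repo and read one metric
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Undi95__MM-ReMM-L2-20B",
    filename="results_2023-10-03T17-39-23.702108.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # assumes the top-level layout shown above
```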
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
shellypeng/refined-anime-images-ds | 2023-10-03T17:40:17.000Z | [
"region:us"
] | shellypeng | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_PygmalionAI__pygmalion-2-13b | 2023-10-03T17:50:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PygmalionAI/pygmalion-2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PygmalionAI/pygmalion-2-13b](https://huggingface.co/PygmalionAI/pygmalion-2-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__pygmalion-2-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:49:20.721820](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2-13b/blob/main/results_2023-10-03T17-49-20.721820.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5612445914530442,\n\
\ \"acc_stderr\": 0.034411456369991295,\n \"acc_norm\": 0.5653937727472228,\n\
\ \"acc_norm_stderr\": 0.03439054119044284,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.4222097304728649,\n\
\ \"mc2_stderr\": 0.014406750015914481\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196206,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180646\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6139215295757817,\n\
\ \"acc_stderr\": 0.004858539527872461,\n \"acc_norm\": 0.8237402907787293,\n\
\ \"acc_norm_stderr\": 0.0038026223415290133\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981762,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981762\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6516129032258065,\n \"acc_stderr\": 0.027104826328100944,\n \"\
acc_norm\": 0.6516129032258065,\n \"acc_norm_stderr\": 0.027104826328100944\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514565,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n\
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.01519047371703751,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.01519047371703751\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277895,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277895\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n\
\ \"acc_stderr\": 0.01646320023811452,\n \"acc_norm\": 0.4122905027932961,\n\
\ \"acc_norm_stderr\": 0.01646320023811452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302898,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.02715520810320086,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.02715520810320086\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722327,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n\
\ \"acc_stderr\": 0.01257087103214607,\n \"acc_norm\": 0.41199478487614083,\n\
\ \"acc_norm_stderr\": 0.01257087103214607\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.4222097304728649,\n\
\ \"mc2_stderr\": 0.014406750015914481\n }\n}\n```"
repo_url: https://huggingface.co/PygmalionAI/pygmalion-2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-49-20.721820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-49-20.721820.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-49-20.721820.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-49-20.721820.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_49_20.721820
path:
- results_2023-10-03T17-49-20.721820.parquet
- split: latest
path:
- results_2023-10-03T17-49-20.721820.parquet
---
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PygmalionAI/pygmalion-2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-2-13b](https://huggingface.co/PygmalionAI/pygmalion-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__pygmalion-2-13b",
"harness_truthfulqa_mc_0",
split="train")
```
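The aggregated scores can be fetched the same way through the `results` configuration declared in this card's header. A minimal sketch using its `latest` split:

```python
from datasets import load_dataset

# The "results" config (see the configs list in the header) holds the
# aggregated scores of a run; "latest" always points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__pygmalion-2-13b",
    "results",
    split="latest",
)
print(results)
```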
## Latest results
These are the [latest results from run 2023-10-03T17:49:20.721820](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2-13b/blob/main/results_2023-10-03T17-49-20.721820.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5612445914530442,
"acc_stderr": 0.034411456369991295,
"acc_norm": 0.5653937727472228,
"acc_norm_stderr": 0.03439054119044284,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.4222097304728649,
"mc2_stderr": 0.014406750015914481
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196206,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180646
},
"harness|hellaswag|10": {
"acc": 0.6139215295757817,
"acc_stderr": 0.004858539527872461,
"acc_norm": 0.8237402907787293,
"acc_norm_stderr": 0.0038026223415290133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981762,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981762
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425072,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514565,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.01519047371703751,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.01519047371703751
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277895,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277895
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.01646320023811452,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.01646320023811452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302898,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.02715520810320086,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.02715520810320086
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722327,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.01257087103214607,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.01257087103214607
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.4222097304728649,
"mc2_stderr": 0.014406750015914481
}
}
```
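Every per-task block above maps to one of the configurations listed in the header, and each configuration exposes both a timestamped split and a `latest` alias. A minimal sketch for one MMLU subtask; since this dataset was built from a single run, both splits resolve to the same parquet files:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_PygmalionAI__pygmalion-2-13b"
CONFIG = "harness_hendrycksTest_world_religions_5"

# "latest" is an alias for the newest run; the timestamped split name
# is taken verbatim from the "configs" list in this card's header.
latest = load_dataset(REPO, CONFIG, split="latest")
by_run = load_dataset(REPO, CONFIG, split="2023_10_03T17_49_20.721820")
assert latest.num_rows == by_run.num_rows  # single run: same files
```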
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
thangved/zitwaste | 2023-10-03T17:52:00.000Z | [
"license:openrail",
"region:us"
] | thangved | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_Dampish__StellarX-4B-V0 | 2023-10-03T17:58:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Dampish/StellarX-4B-V0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Dampish/StellarX-4B-V0](https://huggingface.co/Dampish/StellarX-4B-V0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dampish__StellarX-4B-V0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:57:03.227360](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__StellarX-4B-V0/blob/main/results_2023-10-03T17-57-03.227360.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27266383036075503,\n\
\ \"acc_stderr\": 0.03224051042838464,\n \"acc_norm\": 0.27616554014040245,\n\
\ \"acc_norm_stderr\": 0.03224574795269476,\n \"mc1\": 0.20685434516523868,\n\
\ \"mc1_stderr\": 0.014179591496728343,\n \"mc2\": 0.34296822571733665,\n\
\ \"mc2_stderr\": 0.013628027163865984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32337883959044367,\n \"acc_stderr\": 0.013669421630012125,\n\
\ \"acc_norm\": 0.36945392491467577,\n \"acc_norm_stderr\": 0.0141045783664919\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4584744074885481,\n\
\ \"acc_stderr\": 0.004972543127767877,\n \"acc_norm\": 0.6190001991635132,\n\
\ \"acc_norm_stderr\": 0.004846400325585233\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.0277242364927009,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.0277242364927009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.02951319662553935,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.02951319662553935\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287394,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.22903225806451613,\n \"acc_stderr\": 0.02390491431178265,\n \"\
acc_norm\": 0.22903225806451613,\n \"acc_norm_stderr\": 0.02390491431178265\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.031821550509166484,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.031821550509166484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2512820512820513,\n \"acc_stderr\": 0.02199201666237056,\n \
\ \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.02199201666237056\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097838,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.0256494702658892,\n \
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.0256494702658892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3284403669724771,\n\
\ \"acc_stderr\": 0.020135902797298395,\n \"acc_norm\": 0.3284403669724771,\n\
\ \"acc_norm_stderr\": 0.020135902797298395\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n\
\ \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.11659192825112108,\n\
\ \"acc_stderr\": 0.02153963981624447,\n \"acc_norm\": 0.11659192825112108,\n\
\ \"acc_norm_stderr\": 0.02153963981624447\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.044811377559424694,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.044811377559424694\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349483,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349483\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.02575586592263294,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.02575586592263294\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460852,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460852\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2633637548891786,\n\
\ \"acc_stderr\": 0.011249506403605291,\n \"acc_norm\": 0.2633637548891786,\n\
\ \"acc_norm_stderr\": 0.011249506403605291\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.02604066247420126,\n\
\ \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.02604066247420126\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24019607843137256,\n \"acc_stderr\": 0.017282760695167414,\n \
\ \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.017282760695167414\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407314,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401468,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401468\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.23493975903614459,\n\
\ \"acc_stderr\": 0.03300533186128922,\n \"acc_norm\": 0.23493975903614459,\n\
\ \"acc_norm_stderr\": 0.03300533186128922\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20685434516523868,\n\
\ \"mc1_stderr\": 0.014179591496728343,\n \"mc2\": 0.34296822571733665,\n\
\ \"mc2_stderr\": 0.013628027163865984\n }\n}\n```"
repo_url: https://huggingface.co/Dampish/StellarX-4B-V0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-57-03.227360.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-57-03.227360.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_57_03.227360
path:
- results_2023-10-03T17-57-03.227360.parquet
- split: latest
path:
- results_2023-10-03T17-57-03.227360.parquet
---
# Dataset Card for Evaluation run of Dampish/StellarX-4B-V0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Dampish/StellarX-4B-V0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Dampish/StellarX-4B-V0](https://huggingface.co/Dampish/StellarX-4B-V0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dampish__StellarX-4B-V0",
"harness_truthfulqa_mc_0",
split="train")
```
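The snippet above loads the per-task details; the aggregated scores live in the separate `results` configuration. A minimal sketch for loading them, assuming the `results` config and its `latest` split listed in this card's configs:
```python
from datasets import load_dataset

# Load the aggregated metrics (the "results" configuration) from the most
# recent evaluation run, via the "latest" split declared in the configs above.
results = load_dataset("open-llm-leaderboard/details_Dampish__StellarX-4B-V0",
                       "results",
                       split="latest")
```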
## Latest results
These are the [latest results from run 2023-10-03T17:57:03.227360](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__StellarX-4B-V0/blob/main/results_2023-10-03T17-57-03.227360.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27266383036075503,
"acc_stderr": 0.03224051042838464,
"acc_norm": 0.27616554014040245,
"acc_norm_stderr": 0.03224574795269476,
"mc1": 0.20685434516523868,
"mc1_stderr": 0.014179591496728343,
"mc2": 0.34296822571733665,
"mc2_stderr": 0.013628027163865984
},
"harness|arc:challenge|25": {
"acc": 0.32337883959044367,
"acc_stderr": 0.013669421630012125,
"acc_norm": 0.36945392491467577,
"acc_norm_stderr": 0.0141045783664919
},
"harness|hellaswag|10": {
"acc": 0.4584744074885481,
"acc_stderr": 0.004972543127767877,
"acc_norm": 0.6190001991635132,
"acc_norm_stderr": 0.004846400325585233
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.0277242364927009,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.0277242364927009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.02951319662553935,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.02951319662553935
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287394,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.031821550509166484,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.031821550509166484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.02199201666237056,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.02199201666237056
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097838,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.0256494702658892,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.0256494702658892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3284403669724771,
"acc_stderr": 0.020135902797298395,
"acc_norm": 0.3284403669724771,
"acc_norm_stderr": 0.020135902797298395
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.11659192825112108,
"acc_stderr": 0.02153963981624447,
"acc_norm": 0.11659192825112108,
"acc_norm_stderr": 0.02153963981624447
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.044811377559424694,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.044811377559424694
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349483,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349483
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.02575586592263294,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.02575586592263294
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460852,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460852
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590627,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2633637548891786,
"acc_stderr": 0.011249506403605291,
"acc_norm": 0.2633637548891786,
"acc_norm_stderr": 0.011249506403605291
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.02604066247420126,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.02604066247420126
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.017282760695167414,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.017282760695167414
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407314,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401468,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401468
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.23493975903614459,
"acc_stderr": 0.03300533186128922,
"acc_norm": 0.23493975903614459,
"acc_norm_stderr": 0.03300533186128922
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20685434516523868,
"mc1_stderr": 0.014179591496728343,
"mc2": 0.34296822571733665,
"mc2_stderr": 0.013628027163865984
}
}
```
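To work with a per-task dict like the one above programmatically, the MMLU (`hendrycksTest`) scores can be macro-averaged; a minimal sketch, assuming `results` holds the dict printed above and that the "all" block is a plain mean over per-task accuracies (an assumption, not confirmed by this card):
```python
# `results` is assumed to be the dict shown in the block above.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mmlu_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU macro-average acc: {mmlu_acc:.4f}")
```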
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_AtAndDev__ShortKingv0.1 | 2023-10-03T18:00:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of AtAndDev/ShortKingv0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AtAndDev/ShortKingv0.1](https://huggingface.co/AtAndDev/ShortKingv0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AtAndDev__ShortKingv0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T17:59:37.972814](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKingv0.1/blob/main/results_2023-10-03T17-59-37.972814.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26146546047318087,\n\
\ \"acc_stderr\": 0.03185618252607164,\n \"acc_norm\": 0.2641122458661253,\n\
\ \"acc_norm_stderr\": 0.031862975622891546,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.41640643343088124,\n\
\ \"mc2_stderr\": 0.014296687341198629\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30887372013651876,\n \"acc_stderr\": 0.013501770929344006,\n\
\ \"acc_norm\": 0.34215017064846415,\n \"acc_norm_stderr\": 0.013864152159177278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4230233021310496,\n\
\ \"acc_stderr\": 0.00493029378754561,\n \"acc_norm\": 0.5459071898028282,\n\
\ \"acc_norm_stderr\": 0.004968705270086755\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304135,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292316,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292316\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.038552896163789485,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.038552896163789485\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.267741935483871,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617722,\n\
\ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617722\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511785,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511785\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128016,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128016\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861517,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861517\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036423,\n\
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21973094170403587,\n\
\ \"acc_stderr\": 0.0277901770643836,\n \"acc_norm\": 0.21973094170403587,\n\
\ \"acc_norm_stderr\": 0.0277901770643836\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914407,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2567049808429119,\n\
\ \"acc_stderr\": 0.015620480263064536,\n \"acc_norm\": 0.2567049808429119,\n\
\ \"acc_norm_stderr\": 0.015620480263064536\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.02378620325550828,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.02378620325550828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537776,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537776\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.02518778666022727,\n\
\ \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.02518778666022727\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250068,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265014,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265014\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.41640643343088124,\n\
\ \"mc2_stderr\": 0.014296687341198629\n }\n}\n```"
repo_url: https://huggingface.co/AtAndDev/ShortKingv0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-59-37.972814.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- results_2023-10-03T17-59-37.972814.parquet
- split: latest
path:
- results_2023-10-03T17-59-37.972814.parquet
---
# Dataset Card for Evaluation run of AtAndDev/ShortKingv0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AtAndDev/ShortKingv0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AtAndDev/ShortKingv0.1](https://huggingface.co/AtAndDev/ShortKingv0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AtAndDev__ShortKingv0.1",
"harness_truthfulqa_mc_0",
split="train")
```
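Beyond the per-sample details, the repository also declares a `results` configuration (see `configs` in the YAML header above) whose `latest` split points at the most recent run; a minimal sketch of loading it, assuming the splits resolve as declared there:
```python
from datasets import load_dataset

# Aggregated run-level results; "latest" always resolves to the most
# recent evaluation, per the configs declared in the YAML header.
results = load_dataset(
    "open-llm-leaderboard/details_AtAndDev__ShortKingv0.1",
    "results",
    split="latest",
)
```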
## Latest results
These are the [latest results from run 2023-10-03T17:59:37.972814](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKingv0.1/blob/main/results_2023-10-03T17-59-37.972814.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26146546047318087,
"acc_stderr": 0.03185618252607164,
"acc_norm": 0.2641122458661253,
"acc_norm_stderr": 0.031862975622891546,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.41640643343088124,
"mc2_stderr": 0.014296687341198629
},
"harness|arc:challenge|25": {
"acc": 0.30887372013651876,
"acc_stderr": 0.013501770929344006,
"acc_norm": 0.34215017064846415,
"acc_norm_stderr": 0.013864152159177278
},
"harness|hellaswag|10": {
"acc": 0.4230233021310496,
"acc_stderr": 0.00493029378754561,
"acc_norm": 0.5459071898028282,
"acc_norm_stderr": 0.004968705270086755
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304135,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173043,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173043
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292316,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292316
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.038552896163789485,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.038552896163789485
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617722,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617722
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511785,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511785
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128016,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128016
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861517,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861517
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21973094170403587,
"acc_stderr": 0.0277901770643836,
"acc_norm": 0.21973094170403587,
"acc_norm_stderr": 0.0277901770643836
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914407,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2567049808429119,
"acc_stderr": 0.015620480263064536,
"acc_norm": 0.2567049808429119,
"acc_norm_stderr": 0.015620480263064536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.02378620325550828,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.02378620325550828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.02518778666022727,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.02518778666022727
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250068,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265014,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265014
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.41640643343088124,
"mc2_stderr": 0.014296687341198629
}
}
```
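You can also read these aggregated metrics back from the `results` configuration of the details dataset. A minimal sketch, assuming the `results` config exposes a `latest` split in the same way as the per-task configs in this collection:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# its "latest" split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat",
    "results",
    split="latest",
)
print(results[0])
```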
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_BramVanroy__Llama-2-13b-chat-dutch | 2023-10-03T18:09:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of BramVanroy/Llama-2-13b-chat-dutch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BramVanroy/Llama-2-13b-chat-dutch](https://huggingface.co/BramVanroy/Llama-2-13b-chat-dutch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BramVanroy__Llama-2-13b-chat-dutch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T18:08:13.956421](https://huggingface.co/datasets/open-llm-leaderboard/details_BramVanroy__Llama-2-13b-chat-dutch/blob/main/results_2023-10-03T18-08-13.956421.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5587396973671402,\n\
\ \"acc_stderr\": 0.03435143178969088,\n \"acc_norm\": 0.5631498811377921,\n\
\ \"acc_norm_stderr\": 0.034331048533414704,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.0155507783328429,\n \"mc2\": 0.3822604426807352,\n\
\ \"mc2_stderr\": 0.014035906831583136\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.537542662116041,\n \"acc_stderr\": 0.01457014449507558,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009124\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6097390957976498,\n\
\ \"acc_stderr\": 0.004868117598481945,\n \"acc_norm\": 0.814479187412866,\n\
\ \"acc_norm_stderr\": 0.003879250555254524\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762616,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762616\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.026795560848122804,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.026795560848122804\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017845,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017845\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.018644073041375043,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.018644073041375043\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643524,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955938,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955938\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700917,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700917\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7484035759897829,\n\
\ \"acc_stderr\": 0.015517322365529636,\n \"acc_norm\": 0.7484035759897829,\n\
\ \"acc_norm_stderr\": 0.015517322365529636\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654082,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\
\ \"acc_stderr\": 0.016006989934803196,\n \"acc_norm\": 0.3553072625698324,\n\
\ \"acc_norm_stderr\": 0.016006989934803196\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001872,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001872\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n\
\ \"acc_stderr\": 0.012579699631289264,\n \"acc_norm\": 0.41395045632333766,\n\
\ \"acc_norm_stderr\": 0.012579699631289264\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.020095083154577344,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.020095083154577344\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.031414708025865885,\n\
\ \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.031414708025865885\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.0155507783328429,\n \"mc2\": 0.3822604426807352,\n\
\ \"mc2_stderr\": 0.014035906831583136\n }\n}\n```"
repo_url: https://huggingface.co/BramVanroy/Llama-2-13b-chat-dutch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-08-13.956421.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-08-13.956421.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-08-13.956421.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-08-13.956421.parquet'
- config_name: results
data_files:
- split: 2023_10_03T18_08_13.956421
path:
- results_2023-10-03T18-08-13.956421.parquet
- split: latest
path:
- results_2023-10-03T18-08-13.956421.parquet
---
# Dataset Card for Evaluation run of BramVanroy/Llama-2-13b-chat-dutch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BramVanroy/Llama-2-13b-chat-dutch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BramVanroy/Llama-2-13b-chat-dutch](https://huggingface.co/BramVanroy/Llama-2-13b-chat-dutch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BramVanroy__Llama-2-13b-chat-dutch",
"harness_truthfulqa_mc_0",
split="train")
```
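To pin the load to the most recent run rather than the `train` alias, you can request the `latest` split instead (a minimal sketch; the splits available for each config are listed in the YAML header above):
```python
from datasets import load_dataset

# "latest" is an alias for the newest timestamped split of this config.
data = load_dataset(
    "open-llm-leaderboard/details_BramVanroy__Llama-2-13b-chat-dutch",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```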
## Latest results
These are the [latest results from run 2023-10-03T18:08:13.956421](https://huggingface.co/datasets/open-llm-leaderboard/details_BramVanroy__Llama-2-13b-chat-dutch/blob/main/results_2023-10-03T18-08-13.956421.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5587396973671402,
"acc_stderr": 0.03435143178969088,
"acc_norm": 0.5631498811377921,
"acc_norm_stderr": 0.034331048533414704,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.0155507783328429,
"mc2": 0.3822604426807352,
"mc2_stderr": 0.014035906831583136
},
"harness|arc:challenge|25": {
"acc": 0.537542662116041,
"acc_stderr": 0.01457014449507558,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009124
},
"harness|hellaswag|10": {
"acc": 0.6097390957976498,
"acc_stderr": 0.004868117598481945,
"acc_norm": 0.814479187412866,
"acc_norm_stderr": 0.003879250555254524
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762616,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762616
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122804,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122804
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017845,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017845
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.018644073041375043,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.018644073041375043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643524,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955938,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955938
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700917,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700917
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7484035759897829,
"acc_stderr": 0.015517322365529636,
"acc_norm": 0.7484035759897829,
"acc_norm_stderr": 0.015517322365529636
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654082,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3553072625698324,
"acc_stderr": 0.016006989934803196,
"acc_norm": 0.3553072625698324,
"acc_norm_stderr": 0.016006989934803196
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001872,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001872
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289264,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289264
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.020095083154577344,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.020095083154577344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.0155507783328429,
"mc2": 0.3822604426807352,
"mc2_stderr": 0.014035906831583136
}
}
```
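To reproduce the aggregated numbers above, the "results" configuration can be loaded directly; a minimal sketch, assuming the "latest" split naming used throughout these evaluation datasets:
```python
from datasets import load_dataset

# Aggregated metrics for this model's evaluation run; the "latest" split
# always points to the most recent results (here, the 2023-10-03 run).
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-chat",
    "results",
    split="latest",
)
```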
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_BramVanroy__llama2-13b-ft-mc4_nl_cleaned_tiny | 2023-10-03T18:15:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny](https://huggingface.co/BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BramVanroy__llama2-13b-ft-mc4_nl_cleaned_tiny\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T18:14:29.012381](https://huggingface.co/datasets/open-llm-leaderboard/details_BramVanroy__llama2-13b-ft-mc4_nl_cleaned_tiny/blob/main/results_2023-10-03T18-14-29.012381.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You will find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5478041931115556,\n\
\ \"acc_stderr\": 0.034421246552364795,\n \"acc_norm\": 0.552100869117797,\n\
\ \"acc_norm_stderr\": 0.03440056394916386,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.3803125479959119,\n\
\ \"mc2_stderr\": 0.013645567872439613\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5469283276450512,\n \"acc_stderr\": 0.014546892052005628,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.01435639941800912\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6129257120095598,\n\
\ \"acc_stderr\": 0.004860854240821963,\n \"acc_norm\": 0.8203545110535749,\n\
\ \"acc_norm_stderr\": 0.0038310732859630774\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972606,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972606\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\"\
: 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.41379310344827586,\n\
\ \"acc_stderr\": 0.03465304488406796,\n \"acc_norm\": 0.41379310344827586,\n\
\ \"acc_norm_stderr\": 0.03465304488406796\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03358618145732522,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03358618145732522\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566195,\n \
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566195\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584187,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584187\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n\
\ \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n\
\ \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302898,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.02679542232789394,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.02679542232789394\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n\
\ \"acc_stderr\": 0.012570871032146073,\n \"acc_norm\": 0.41199478487614083,\n\
\ \"acc_norm_stderr\": 0.012570871032146073\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.020142974553795205,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.020142974553795205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.3803125479959119,\n\
\ \"mc2_stderr\": 0.013645567872439613\n }\n}\n```"
repo_url: https://huggingface.co/BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-14-29.012381.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-14-29.012381.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-14-29.012381.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-14-29.012381.parquet'
- config_name: results
data_files:
- split: 2023_10_03T18_14_29.012381
path:
- results_2023-10-03T18-14-29.012381.parquet
- split: latest
path:
- results_2023-10-03T18-14-29.012381.parquet
---
# Dataset Card for Evaluation run of BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny](https://huggingface.co/BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BramVanroy__llama2-13b-ft-mc4_nl_cleaned_tiny",
"harness_truthfulqa_mc_0",
split="train")
```
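Any of the 61 task configurations listed above can be loaded the same way; for instance, to get the per-sample details of a single MMLU subtask via the "latest" split, which points to the most recent run:
```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask of this run; config names follow
# the harness_hendrycksTest_<subject>_5 pattern listed in the configs above.
anatomy = load_dataset(
    "open-llm-leaderboard/details_BramVanroy__llama2-13b-ft-mc4_nl_cleaned_tiny",
    "harness_hendrycksTest_anatomy_5",
    split="latest",
)
```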
## Latest results
These are the [latest results from run 2023-10-03T18:14:29.012381](https://huggingface.co/datasets/open-llm-leaderboard/details_BramVanroy__llama2-13b-ft-mc4_nl_cleaned_tiny/blob/main/results_2023-10-03T18-14-29.012381.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You will find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5478041931115556,
"acc_stderr": 0.034421246552364795,
"acc_norm": 0.552100869117797,
"acc_norm_stderr": 0.03440056394916386,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.3803125479959119,
"mc2_stderr": 0.013645567872439613
},
"harness|arc:challenge|25": {
"acc": 0.5469283276450512,
"acc_stderr": 0.014546892052005628,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.01435639941800912
},
"harness|hellaswag|10": {
"acc": 0.6129257120095598,
"acc_stderr": 0.004860854240821963,
"acc_norm": 0.8203545110535749,
"acc_norm_stderr": 0.0038310732859630774
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972606,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972606
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566195,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566195
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584187,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584187
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302898,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.02679542232789394,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.02679542232789394
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.012570871032146073,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.012570871032146073
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.020142974553795205,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.020142974553795205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.3803125479959119,
"mc2_stderr": 0.013645567872439613
}
}
```
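The per-task dictionary above follows a regular `harness|<task>|<n_shot>` naming scheme, so it can be sliced programmatically. Below is a minimal sketch that ranks the MMLU (`hendrycksTest`) subtasks by `acc_norm`; only a few entries from the block above are reproduced, purely for illustration:
```python
# Minimal sketch: rank MMLU subtasks by acc_norm.
# `results` stands in for the full dict printed above; only three of its
# entries are reproduced here for illustration.
results = {
    "harness|hendrycksTest-computer_security|5": {"acc_norm": 0.7},
    "harness|hendrycksTest-econometrics|5": {"acc_norm": 0.2543859649122807},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.81},
}

# Keep only the MMLU (hendrycksTest) subtasks and sort them, best first.
mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, score in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.3f}  {task}")
```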
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain | 2023-10-03T18:17:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of haoranxu/ALMA-13B-Pretrain
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [haoranxu/ALMA-13B-Pretrain](https://huggingface.co/haoranxu/ALMA-13B-Pretrain)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T18:16:28.187729](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain/blob/main/results_2023-10-03T18-16-28.187729.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5052324964508096,\n\
\ \"acc_stderr\": 0.034997734866206914,\n \"acc_norm\": 0.5092587342822293,\n\
\ \"acc_norm_stderr\": 0.034980509895033284,\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.3744273268175779,\n\
\ \"mc2_stderr\": 0.013679328267583054\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.53839590443686,\n \"acc_stderr\": 0.014568245550296356,\n\
\ \"acc_norm\": 0.5691126279863481,\n \"acc_norm_stderr\": 0.014471133392642463\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5947022505476997,\n\
\ \"acc_stderr\": 0.004899462111832331,\n \"acc_norm\": 0.8015335590519816,\n\
\ \"acc_norm_stderr\": 0.003980300970241414\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458006,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458006\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923183,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923183\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819078,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819078\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5806451612903226,\n \"acc_stderr\": 0.028071588901091845,\n \"\
acc_norm\": 0.5806451612903226,\n \"acc_norm_stderr\": 0.028071588901091845\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n \"acc_norm\"\
: 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.037937131711656344,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.037937131711656344\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.0253106392549339,\n \
\ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.0253106392549339\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.02752859921034049,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.02752859921034049\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5042016806722689,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.5042016806722689,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6477064220183486,\n \"acc_stderr\": 0.020480568843998986,\n \"\
acc_norm\": 0.6477064220183486,\n \"acc_norm_stderr\": 0.020480568843998986\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6413502109704642,\n \"acc_stderr\": 0.031219569445301843,\n \
\ \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.031219569445301843\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.0484674825397724,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.0484674825397724\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.03057281131029961,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.03057281131029961\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6794380587484036,\n\
\ \"acc_stderr\": 0.016688893310803782,\n \"acc_norm\": 0.6794380587484036,\n\
\ \"acc_norm_stderr\": 0.016688893310803782\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.02755994980234782,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.02755994980234782\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n\
\ \"acc_stderr\": 0.012389052105003727,\n \"acc_norm\": 0.378748370273794,\n\
\ \"acc_norm_stderr\": 0.012389052105003727\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329387,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329387\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5130718954248366,\n \"acc_stderr\": 0.020220920829626912,\n \
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.020220920829626912\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979035,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.3744273268175779,\n\
\ \"mc2_stderr\": 0.013679328267583054\n }\n}\n```"
repo_url: https://huggingface.co/haoranxu/ALMA-13B-Pretrain
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-16-28.187729.parquet'
- config_name: results
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- results_2023-10-03T18-16-28.187729.parquet
- split: latest
path:
- results_2023-10-03T18-16-28.187729.parquet
---
# Dataset Card for Evaluation run of haoranxu/ALMA-13B-Pretrain
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/haoranxu/ALMA-13B-Pretrain
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [haoranxu/ALMA-13B-Pretrain](https://huggingface.co/haoranxu/ALMA-13B-Pretrain) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
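If you would rather enumerate the available configurations programmatically than read the YAML above, a minimal sketch using the standard `datasets` API is:
```python
from datasets import get_dataset_config_names

# List every configuration of this dataset: one per evaluated task,
# plus the additional "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain")
print(len(configs), configs[:5])
```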
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain",
"harness_truthfulqa_mc_0",
split="train")
```
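Similarly, the aggregated metrics (shown under "Latest results" below) live in the "results" configuration; as defined in the configs above, its "latest" split always points at the most recent run. A minimal sketch:
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain",
    "results",
    split="latest",
)
```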
## Latest results
These are the [latest results from run 2023-10-03T18:16:28.187729](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain/blob/main/results_2023-10-03T18-16-28.187729.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5052324964508096,
"acc_stderr": 0.034997734866206914,
"acc_norm": 0.5092587342822293,
"acc_norm_stderr": 0.034980509895033284,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148125,
"mc2": 0.3744273268175779,
"mc2_stderr": 0.013679328267583054
},
"harness|arc:challenge|25": {
"acc": 0.53839590443686,
"acc_stderr": 0.014568245550296356,
"acc_norm": 0.5691126279863481,
"acc_norm_stderr": 0.014471133392642463
},
"harness|hellaswag|10": {
"acc": 0.5947022505476997,
"acc_stderr": 0.004899462111832331,
"acc_norm": 0.8015335590519816,
"acc_norm_stderr": 0.003980300970241414
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458006,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458006
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923183,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923183
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819078,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819078
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5806451612903226,
"acc_stderr": 0.028071588901091845,
"acc_norm": 0.5806451612903226,
"acc_norm_stderr": 0.028071588901091845
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.037937131711656344,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.037937131711656344
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155141,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.0253106392549339,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.0253106392549339
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.02752859921034049,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.02752859921034049
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5042016806722689,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.5042016806722689,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6477064220183486,
"acc_stderr": 0.020480568843998986,
"acc_norm": 0.6477064220183486,
"acc_norm_stderr": 0.020480568843998986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6413502109704642,
"acc_stderr": 0.031219569445301843,
"acc_norm": 0.6413502109704642,
"acc_norm_stderr": 0.031219569445301843
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.0484674825397724,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.0484674825397724
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.03057281131029961,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.03057281131029961
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6794380587484036,
"acc_stderr": 0.016688893310803782,
"acc_norm": 0.6794380587484036,
"acc_norm_stderr": 0.016688893310803782
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.02755994980234782,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.02755994980234782
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.378748370273794,
"acc_stderr": 0.012389052105003727,
"acc_norm": 0.378748370273794,
"acc_norm_stderr": 0.012389052105003727
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.020220920829626912,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.020220920829626912
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979035,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148125,
"mc2": 0.3744273268175779,
"mc2_stderr": 0.013679328267583054
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/monogatariseries | 2023-10-03T23:22:55.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Monogatari Series
This is the image base of the bangumi Monogatari Series. We detected 66 characters and 8,964 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noisy samples.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to remove potentially noisy samples (roughly 1% of images).
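If you plan to follow the preprocessing recommendation above, a minimal sketch of pulling a single character's archive with `huggingface_hub` is shown below; the repo id and the `<index>/dataset.zip` layout come from the table that follows, while the local destination directory is just an assumption for the example.
```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# Each character cluster ships as "<index>/dataset.zip" in this dataset repo;
# index 0 (the largest cluster in the table below) is used here, and "all.zip"
# holds the full image base.
archive = hf_hub_download(
    repo_id="BangumiBase/monogatariseries",
    filename="0/dataset.zip",
    repo_type="dataset",
)

# Extract locally so noisy samples (roughly 1% of images) can be reviewed and
# removed before training; "character_0" is an arbitrary name for this sketch.
out_dir = Path("character_0")
out_dir.mkdir(exist_ok=True)
with zipfile.ZipFile(archive) as zf:
    zf.extractall(out_dir)
print(f"Extracted {len(list(out_dir.iterdir()))} files to {out_dir}")
```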
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 2206 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 64 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 82 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 163 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 180 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 106 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 354 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 63 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 166 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 121 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 545 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 302 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 92 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 399 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 170 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 86 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 126 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 25 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 289 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 39 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 52 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 57 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 24 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 275 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 48 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 77 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 96 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 50 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 41 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 99 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 22 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 37 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 282 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 66 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 61 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 26 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 18 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 158 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 431 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 25 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 23 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 19 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 35 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 11 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 10 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 18 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 21 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 447 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 38 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 53 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 48 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 33 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 78 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 8 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 25 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 100 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 42 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 12 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 13 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 6 | [Download](59/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 60 | 11 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 41 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 12 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 7 | [Download](63/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 64 | 8 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 322 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
goendalf666/sales-conversations-instruction-ext | 2023-10-03T18:20:18.000Z | [
"region:us"
] | goendalf666 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: '0'
dtype: string
splits:
- name: train
num_bytes: 28036745
num_examples: 20940
download_size: 4782593
dataset_size: 28036745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sales-conversations-instruction_ext"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Globaly/categorias-180k-test | 2023-10-03T18:28:09.000Z | [
"region:us"
] | Globaly | null | null | null | 0 | 0 | Entry not found |
seansullivan/hugg-rob | 2023-10-03T18:34:15.000Z | [
"license:other",
"region:us"
] | seansullivan | null | null | null | 0 | 0 | ---
license: other
license_name: ss
license_link: LICENSE
---
|
nmsdvid/billy-website-copy | 2023-10-03T18:34:01.000Z | [
"license:mit",
"region:us"
] | nmsdvid | null | null | null | 0 | 0 | ---
license: mit
---
|
open-llm-leaderboard/details_Undi95__U-Amethyst-20B | 2023-10-03T18:45:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/U-Amethyst-20B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/U-Amethyst-20B](https://huggingface.co/Undi95/U-Amethyst-20B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__U-Amethyst-20B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T18:44:08.205769](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__U-Amethyst-20B/blob/main/results_2023-10-03T18-44-08.205769.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5606420063064564,\n\
\ \"acc_stderr\": 0.034398732723866246,\n \"acc_norm\": 0.5644594145062996,\n\
\ \"acc_norm_stderr\": 0.034377283403392314,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677143,\n \"mc2\": 0.5320122237340842,\n\
\ \"mc2_stderr\": 0.015624089171491088\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303101\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6400119498107947,\n\
\ \"acc_stderr\": 0.004790155370993448,\n \"acc_norm\": 0.8311093407687712,\n\
\ \"acc_norm_stderr\": 0.0037388962449538144\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196156,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196156\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873502,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873502\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.02704574657353433,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.02704574657353433\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011748,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011748\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7137614678899082,\n \"acc_stderr\": 0.01937943662891998,\n \"\
acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.01937943662891998\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658346,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483706,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483706\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.02567428145653102,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.02567428145653102\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.0156099295593484,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.0156099295593484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.02718449890994162,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.02718449890994162\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409814,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409814\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657114,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657114\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159644,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159644\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227474,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227474\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677143,\n \"mc2\": 0.5320122237340842,\n\
\ \"mc2_stderr\": 0.015624089171491088\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/U-Amethyst-20B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-44-08.205769.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-44-08.205769.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-44-08.205769.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-44-08.205769.parquet'
- config_name: results
data_files:
- split: 2023_10_03T18_44_08.205769
path:
- results_2023-10-03T18-44-08.205769.parquet
- split: latest
path:
- results_2023-10-03T18-44-08.205769.parquet
---
# Dataset Card for Evaluation run of Undi95/U-Amethyst-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/U-Amethyst-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/U-Amethyst-20B](https://huggingface.co/Undi95/U-Amethyst-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__U-Amethyst-20B",
"harness_truthfulqa_mc_0",
split="train")
```
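Along the same lines, the aggregated metrics can be loaded from the `results` configuration declared above, whose `latest` split always points to the most recent run. A minimal sketch, assuming only the configurations and splits listed in this card:
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split resolves to the
# most recent evaluation run declared in the configs above.
results = load_dataset("open-llm-leaderboard/details_Undi95__U-Amethyst-20B",
	"results",
	split="latest")
```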
## Latest results
These are the [latest results from run 2023-10-03T18:44:08.205769](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__U-Amethyst-20B/blob/main/results_2023-10-03T18-44-08.205769.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5606420063064564,
"acc_stderr": 0.034398732723866246,
"acc_norm": 0.5644594145062996,
"acc_norm_stderr": 0.034377283403392314,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677143,
"mc2": 0.5320122237340842,
"mc2_stderr": 0.015624089171491088
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303101
},
"harness|hellaswag|10": {
"acc": 0.6400119498107947,
"acc_stderr": 0.004790155370993448,
"acc_norm": 0.8311093407687712,
"acc_norm_stderr": 0.0037388962449538144
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196156,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196156
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873502,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873502
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011748,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011748
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.01937943662891998,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.01937943662891998
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658346,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.042258754519696365,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.042258754519696365
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483706,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483706
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.02567428145653102,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.02567428145653102
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.0156099295593484,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.0156099295593484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.02718449890994162,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.02718449890994162
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409814,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409814
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657114,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159644,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159644
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227474,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227474
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677143,
"mc2": 0.5320122237340842,
"mc2_stderr": 0.015624089171491088
}
}
```
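If you prefer to work with the raw JSON shown above rather than the parquet-backed configurations, one option is to download the results file directly with `huggingface_hub`. A sketch, assuming only the filename linked above; the printed keys depend on the file's actual layout:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Undi95__U-Amethyst-20B",
    filename="results_2023-10-03T18-44-08.205769.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # top-level keys of the results file
```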
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-353d4d4e-c741-4f8a-97a8-e6d916a1eab2 | 2023-10-03T18:51:22.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-e28f8f9a-eb3e-4721-958b-d4888ceda541 | 2023-10-03T18:59:23.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft | 2023-10-03T19:10:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/Llama-2-13b-chat-longlora-32k-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-13b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:09:03.932151](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft/blob/main/results_2023-10-03T19-09-03.932151.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2312095852777777,\n\
\ \"acc_stderr\": 0.03070521571119534,\n \"acc_norm\": 0.23219325107008887,\n\
\ \"acc_norm_stderr\": 0.030722369194984736,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041859,\n \"mc2\": 0.49066988843051723,\n\
\ \"mc2_stderr\": 0.016895332527288195\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20733788395904437,\n \"acc_stderr\": 0.011846905782971361,\n\
\ \"acc_norm\": 0.26109215017064846,\n \"acc_norm_stderr\": 0.012835523909473855\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2574188408683529,\n\
\ \"acc_stderr\": 0.004363185172047181,\n \"acc_norm\": 0.261700856403107,\n\
\ \"acc_norm_stderr\": 0.004386622589119084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041859,\n\
\ \"mc2\": 0.49066988843051723,\n \"mc2_stderr\": 0.016895332527288195\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-01-52.732036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-01-52.732036.parquet'
- split: 2023_10_03T19_09_03.932151
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-09-03.932151.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-09-03.932151.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_01_52.732036
path:
- results_2023-10-03T19-01-52.732036.parquet
- split: 2023_10_03T19_09_03.932151
path:
- results_2023-10-03T19-09-03.932151.parquet
- split: latest
path:
- results_2023-10-03T19-09-03.932151.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-13b-chat-longlora-32k-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft",
"harness_truthfulqa_mc_0",
split="train")
```
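The aggregated scores live in the "results" configuration mentioned above. As a minimal sketch (assuming the `datasets` and `pandas` packages are installed), you can load the latest aggregated results and turn them into a DataFrame like this:
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points
# to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft",
    "results",
    split="latest",
)

# Convert to a pandas DataFrame for easier inspection.
df = results.to_pandas()
print(df.head())
```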
## Latest results
These are the [latest results from run 2023-10-03T19:09:03.932151](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft/blob/main/results_2023-10-03T19-09-03.932151.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2312095852777777,
"acc_stderr": 0.03070521571119534,
"acc_norm": 0.23219325107008887,
"acc_norm_stderr": 0.030722369194984736,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041859,
"mc2": 0.49066988843051723,
"mc2_stderr": 0.016895332527288195
},
"harness|arc:challenge|25": {
"acc": 0.20733788395904437,
"acc_stderr": 0.011846905782971361,
"acc_norm": 0.26109215017064846,
"acc_norm_stderr": 0.012835523909473855
},
"harness|hellaswag|10": {
"acc": 0.2574188408683529,
"acc_stderr": 0.004363185172047181,
"acc_norm": 0.261700856403107,
"acc_norm_stderr": 0.004386622589119084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041859,
"mc2": 0.49066988843051723,
"mc2_stderr": 0.016895332527288195
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
snats/chico2prompts | 2023-10-03T19:12:33.000Z | [
"license:cc-by-4.0",
"region:us"
] | snats | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
# chico2prompts
There are two files, each following a different prompt. They are two separate CSV files in Spanish.
# Prompts
First prompt: Suggest a title for the following story.
In English:
```
Suggest a title for the following story:
{{contents}}
completion:
Sure, here's a suitable title for the given story {{titles}}.
```
In Spanish:
```
Sugiere un título para la siguiente historia: {{contents}}
Completado por lo siguiente:
Un título posible para la siguiente historia podría ser: {{titles}}
```
Second prompt: Write a short story based on the following title.
In English:
```
prompt:
Write a short story based on the following title:
{{titles}}
completion:
{{contents}}
```
In Spanish:
```
prompt:
Escribe una historia corta basada en el siguiente título {{titles}}
completion:
{{contents}}
```
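As a rough illustration of how these templates could be filled from the CSV files, here is a minimal sketch. The file name (`chico2prompts_titles.csv`) and the column names (`titles`, `contents`) are assumptions for illustration, not confirmed by this card:
```python
import pandas as pd

# Hypothetical file and column names; adjust to the actual CSV files.
df = pd.read_csv("chico2prompts_titles.csv")

# Spanish template for the first ("suggest a title") prompt.
template = (
    "Sugiere un título para la siguiente historia: {contents}\n"
    "Completado por lo siguiente:\n"
    "Un título posible para la siguiente historia podría ser: {titles}"
)

# Fill the template once per row of the dataset.
examples = [
    template.format(contents=row.contents, titles=row.titles)
    for row in df.itertuples()
]
print(examples[0])
```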
|
RaymondLi/the_stack_v2_python | 2023-10-03T19:07:01.000Z | [
"region:us"
] | RaymondLi | null | null | null | 0 | 0 | Stack-v2 Python data.
Deduped, filtered, and decontaminated.
Includes permissively-licensed and no-license data; non-permissive data is excluded.
|
Intuit-GenSRF/es_lawyer_instruct | 2023-10-03T19:14:34.000Z | [
"region:us"
] | Intuit-GenSRF | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: float64
- name: split
dtype: string
- name: text
dtype: string
- name: text_spanish
dtype: string
splits:
- name: train
num_bytes: 16852186
num_examples: 9241
download_size: 7403208
dataset_size: 16852186
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "es_lawyer_instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Intuit-GenSRF/es_mmlu_law | 2023-10-03T19:15:36.000Z | [
"region:us"
] | Intuit-GenSRF | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
dtype: string
- name: answer
dtype: int64
- name: negate_openai_prompt
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: split
dtype: string
- name: text
dtype: string
- name: text_spanish
dtype: string
splits:
- name: train
num_bytes: 11082953
num_examples: 1539
download_size: 3462359
dataset_size: 11082953
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "es_mmlu_law"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Intuit-GenSRF/es_legal_advice_reddit | 2023-10-03T19:17:04.000Z | [
"region:us"
] | Intuit-GenSRF | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: created_utc
dtype: int64
- name: full_link
dtype: string
- name: id
dtype: string
- name: body
dtype: string
- name: title
dtype: string
- name: text_label
dtype: string
- name: flair_label
dtype: int64
- name: split
dtype: string
- name: text
dtype: string
- name: text_spanish
dtype: string
splits:
- name: train
num_bytes: 404835305
num_examples: 98910
download_size: 244411822
dataset_size: 404835305
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "es_legal_advice_reddit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged | 2023-10-03T19:19:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged](https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:18:10.138787](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged/blob/main/results_2023-10-03T19-18-10.138787.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4615194630526937,\n\
\ \"acc_stderr\": 0.03520699958636604,\n \"acc_norm\": 0.46578691028395397,\n\
\ \"acc_norm_stderr\": 0.03519297242720297,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4612985634358318,\n\
\ \"mc2_stderr\": 0.014133312214187982\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48208191126279865,\n \"acc_stderr\": 0.014602005585490978,\n\
\ \"acc_norm\": 0.5366894197952219,\n \"acc_norm_stderr\": 0.014572000527756994\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5849432383987253,\n\
\ \"acc_stderr\": 0.004917248150601852,\n \"acc_norm\": 0.7821151165106552,\n\
\ \"acc_norm_stderr\": 0.004119650817714286\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.4490566037735849,\n \"acc_stderr\": 0.030612730713641095,\n \
\ \"acc_norm\": 0.4490566037735849,\n \"acc_norm_stderr\": 0.030612730713641095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.043255060420170854,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.043255060420170854\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47096774193548385,\n\
\ \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.47096774193548385,\n\
\ \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.038435669935887165,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.038435669935887165\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.025158266016868557,\n\
\ \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.025158266016868557\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03214536859788639,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03214536859788639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6201834862385321,\n \"acc_stderr\": 0.020808825617866244,\n \"\
acc_norm\": 0.6201834862385321,\n \"acc_norm_stderr\": 0.020808825617866244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993652,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993652\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5392156862745098,\n \"acc_stderr\": 0.03498501649369527,\n \"\
acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6160337552742616,\n \"acc_stderr\": 0.031658678064106674,\n \
\ \"acc_norm\": 0.6160337552742616,\n \"acc_norm_stderr\": 0.031658678064106674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.033310925110381785,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.033310925110381785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.048657775704107696,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.048657775704107696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.030351527323344937,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.030351527323344937\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6424010217113666,\n\
\ \"acc_stderr\": 0.017139488998803284,\n \"acc_norm\": 0.6424010217113666,\n\
\ \"acc_norm_stderr\": 0.017139488998803284\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.02691504735536981,\n\
\ \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.02691504735536981\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961443,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961443\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.02791705074848462,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.02791705074848462\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36310299869621904,\n\
\ \"acc_stderr\": 0.012282264406018754,\n \"acc_norm\": 0.36310299869621904,\n\
\ \"acc_norm_stderr\": 0.012282264406018754\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275668,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4477124183006536,\n \"acc_stderr\": 0.02011692534742242,\n \
\ \"acc_norm\": 0.4477124183006536,\n \"acc_norm_stderr\": 0.02011692534742242\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004129,\n\
\ \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.035282112582452306,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.035282112582452306\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4612985634358318,\n\
\ \"mc2_stderr\": 0.014133312214187982\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-18-10.138787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-18-10.138787.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-18-10.138787.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-18-10.138787.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_18_10.138787
path:
- results_2023-10-03T19-18-10.138787.parquet
- split: latest
path:
- results_2023-10-03T19-18-10.138787.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged](https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged",
"harness_truthfulqa_mc_0",
split="train")
```
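Beyond the `train` split shown above, each task has its own configuration, and every configuration exposes the timestamped split plus a `latest` alias (both declared in the YAML header of this card). A minimal sketch, reusing only names that appear in that header:
```python
from datasets import load_dataset

# Load a single MMLU subtask; "latest" always resolves to the most recent run.
details = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```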
## Latest results
These are the [latest results from run 2023-10-03T19:18:10.138787](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged/blob/main/results_2023-10-03T19-18-10.138787.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4615194630526937,
"acc_stderr": 0.03520699958636604,
"acc_norm": 0.46578691028395397,
"acc_norm_stderr": 0.03519297242720297,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4612985634358318,
"mc2_stderr": 0.014133312214187982
},
"harness|arc:challenge|25": {
"acc": 0.48208191126279865,
"acc_stderr": 0.014602005585490978,
"acc_norm": 0.5366894197952219,
"acc_norm_stderr": 0.014572000527756994
},
"harness|hellaswag|10": {
"acc": 0.5849432383987253,
"acc_stderr": 0.004917248150601852,
"acc_norm": 0.7821151165106552,
"acc_norm_stderr": 0.004119650817714286
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4490566037735849,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.4490566037735849,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.043255060420170854,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.043255060420170854
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47096774193548385,
"acc_stderr": 0.028396016402761005,
"acc_norm": 0.47096774193548385,
"acc_norm_stderr": 0.028396016402761005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.038435669935887165,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.038435669935887165
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.025158266016868557,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.025158266016868557
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03214536859788639,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03214536859788639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6201834862385321,
"acc_stderr": 0.020808825617866244,
"acc_norm": 0.6201834862385321,
"acc_norm_stderr": 0.020808825617866244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993652,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6160337552742616,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.6160337552742616,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.033310925110381785,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.033310925110381785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.048657775704107696,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.048657775704107696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344937,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344937
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6424010217113666,
"acc_stderr": 0.017139488998803284,
"acc_norm": 0.6424010217113666,
"acc_norm_stderr": 0.017139488998803284
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.02691504735536981,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.02691504735536981
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961443,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961443
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.02791705074848462,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.02791705074848462
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36310299869621904,
"acc_stderr": 0.012282264406018754,
"acc_norm": 0.36310299869621904,
"acc_norm_stderr": 0.012282264406018754
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275668,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4477124183006536,
"acc_stderr": 0.02011692534742242,
"acc_norm": 0.4477124183006536,
"acc_norm_stderr": 0.02011692534742242
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004129,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.035282112582452306,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.035282112582452306
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4612985634358318,
"mc2_stderr": 0.014133312214187982
}
}
```
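If you prefer the raw aggregate file over the parquet splits, the JSON linked above can be fetched directly. A minimal sketch, assuming only that the file keeps the name used in the link (the exact schema of the JSON is not documented here, so inspect its keys first):
```python
import json

from huggingface_hub import hf_hub_download

# repo_type="dataset" is required because the details live in a dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-1024_qlora_merged",
    filename="results_2023-10-03T19-18-10.138787.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

print(list(results.keys()))  # inspect the structure before reading metrics
```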
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Intuit-GenSRF/es_mental_health_counseling | 2023-10-03T19:21:50.000Z | [
"region:us"
] | Intuit-GenSRF | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Response
dtype: string
- name: split
dtype: string
- name: text
dtype: string
- name: text_spanish
dtype: string
splits:
- name: train
num_bytes: 13763461
num_examples: 3512
download_size: 7425319
dataset_size: 13763461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "es_mental_health_counseling"
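A minimal loading sketch, assuming only the `default` config and `train` split declared in the YAML header above (fields: `Context`, `Response`, `split`, `text`, `text_spanish`):
```python
from datasets import load_dataset

# Single "train" split with 3,512 examples, per the dataset_info above.
ds = load_dataset("Intuit-GenSRF/es_mental_health_counseling", split="train")
print(ds[0]["Context"])
print(ds[0]["text_spanish"])
```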
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Intuit-GenSRF/es_counsel_chat | 2023-10-03T19:22:24.000Z | [
"region:us"
] | Intuit-GenSRF | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: questionID
dtype: int64
- name: questionTitle
dtype: string
- name: questionText
dtype: string
- name: questionLink
dtype: string
- name: topic
dtype: string
- name: therapistInfo
dtype: string
- name: therapistURL
dtype: string
- name: answerText
dtype: string
- name: upvotes
dtype: int64
- name: views
dtype: int64
- name: split
dtype: string
- name: text
dtype: string
- name: text_spanish
dtype: string
splits:
- name: train
num_bytes: 10490383
num_examples: 2612
download_size: 5137621
dataset_size: 10490383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "es_counsel_chat"
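As with the sibling dataset above, a short sketch for loading and slicing by the `topic` column declared in the YAML header; the topic value used in the filter is hypothetical, since the card does not list the actual topics:
```python
from datasets import load_dataset

ds = load_dataset("Intuit-GenSRF/es_counsel_chat", split="train")  # 2,612 rows

# "depression" is a hypothetical topic value; pick a real one from ds.unique("topic").
subset = ds.filter(lambda row: row["topic"] == "depression")
print(len(subset))
```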
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_PulsarAI__EnsembleV5-Nova-13B | 2023-10-03T19:24:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/EnsembleV5-Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/EnsembleV5-Nova-13B](https://huggingface.co/PulsarAI/EnsembleV5-Nova-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__EnsembleV5-Nova-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:22:59.151966](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__EnsembleV5-Nova-13B/blob/main/results_2023-10-03T19-22-59.151966.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one under the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5689777286407749,\n\
\ \"acc_stderr\": 0.03448372173215078,\n \"acc_norm\": 0.5732553402735832,\n\
\ \"acc_norm_stderr\": 0.03446082061672094,\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.4985854685041301,\n\
\ \"mc2_stderr\": 0.015160720709708817\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.01413117676013117\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6217884883489345,\n\
\ \"acc_stderr\": 0.004839497020536615,\n \"acc_norm\": 0.8255327623979287,\n\
\ \"acc_norm_stderr\": 0.0037873515193708063\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.02698528957655274,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.02698528957655274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198913,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198913\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139746,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139746\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518722,\n \
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518722\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.017149858514250955,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.017149858514250955\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n\
\ \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n\
\ \"acc_stderr\": 0.01530238012354209,\n \"acc_norm\": 0.7586206896551724,\n\
\ \"acc_norm_stderr\": 0.01530238012354209\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631445,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631445\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037086,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037086\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787682,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.4985854685041301,\n\
\ \"mc2_stderr\": 0.015160720709708817\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/EnsembleV5-Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-22-59.151966.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-22-59.151966.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-22-59.151966.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-22-59.151966.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_22_59.151966
path:
- results_2023-10-03T19-22-59.151966.parquet
- split: latest
path:
- results_2023-10-03T19-22-59.151966.parquet
---
# Dataset Card for Evaluation run of PulsarAI/EnsembleV5-Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/EnsembleV5-Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/EnsembleV5-Nova-13B](https://huggingface.co/PulsarAI/EnsembleV5-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__EnsembleV5-Nova-13B",
"harness_truthfulqa_mc_0",
split="train")
```
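The other configurations listed in this card follow the same pattern. As a minimal sketch (using only the `results` configuration and the `latest` split documented in this card's configs section), the aggregated metrics can be loaded like this:
```python
from datasets import load_dataset

# "results" and "latest" are the configuration and split names
# listed in this card's configs section.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__EnsembleV5-Nova-13B",
    "results",
    split="latest",
)
```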
## Latest results
These are the [latest results from run 2023-10-03T19:22:59.151966](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__EnsembleV5-Nova-13B/blob/main/results_2023-10-03T19-22-59.151966.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5689777286407749,
"acc_stderr": 0.03448372173215078,
"acc_norm": 0.5732553402735832,
"acc_norm_stderr": 0.03446082061672094,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.4985854685041301,
"mc2_stderr": 0.015160720709708817
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.01413117676013117
},
"harness|hellaswag|10": {
"acc": 0.6217884883489345,
"acc_stderr": 0.004839497020536615,
"acc_norm": 0.8255327623979287,
"acc_norm_stderr": 0.0037873515193708063
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198913,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198913
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139746,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250955,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250955
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.01530238012354209,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.01530238012354209
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959402,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631445,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631445
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037086,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037086
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787682,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.4985854685041301,
"mc2_stderr": 0.015160720709708817
}
}
```
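If you prefer the raw results file to the `datasets` API, a minimal sketch using `huggingface_hub` is shown below; the exact key layout of the JSON file is assumed to match the dictionary printed above.
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_PulsarAI__EnsembleV5-Nova-13B",
    filename="results_2023-10-03T19-22-59.151966.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Assumes the top-level layout shown above ("all", "harness|..." keys).
print(results["all"]["acc_norm"])
```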
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged | 2023-10-03T19:25:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged](https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:23:38.548591](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged/blob/main/results_2023-10-03T19-23-38.548591.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4589360172299559,\n\
\ \"acc_stderr\": 0.035095035290907385,\n \"acc_norm\": 0.46314201145146056,\n\
\ \"acc_norm_stderr\": 0.03508099078679357,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4171587207764145,\n\
\ \"mc2_stderr\": 0.013880088765252275\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4872013651877133,\n \"acc_stderr\": 0.014606603181012541,\n\
\ \"acc_norm\": 0.5366894197952219,\n \"acc_norm_stderr\": 0.014572000527756993\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5822545309699263,\n\
\ \"acc_stderr\": 0.004921798492608778,\n \"acc_norm\": 0.780920135431189,\n\
\ \"acc_norm_stderr\": 0.004127775403148707\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.03733626655383509,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.03733626655383509\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127155,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127155\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.4935483870967742,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.03210494433751458,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.03210494433751458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104284,\n\
\ \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104284\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.02512465352588513,\n\
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.02512465352588513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145647,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6073394495412844,\n \"acc_stderr\": 0.020937505161201093,\n \"\
acc_norm\": 0.6073394495412844,\n \"acc_norm_stderr\": 0.020937505161201093\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.03509312031717982,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.03509312031717982\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.5991561181434599,\n \"acc_stderr\": 0.03190080389473235,\n\
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.03190080389473235\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.03057281131029961,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.03057281131029961\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6360153256704981,\n\
\ \"acc_stderr\": 0.017205684809032232,\n \"acc_norm\": 0.6360153256704981,\n\
\ \"acc_norm_stderr\": 0.017205684809032232\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.523121387283237,\n \"acc_stderr\": 0.026890297881303118,\n\
\ \"acc_norm\": 0.523121387283237,\n \"acc_norm_stderr\": 0.026890297881303118\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.02849199358617156,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.02849199358617156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4691358024691358,\n \"acc_stderr\": 0.027767689606833935,\n\
\ \"acc_norm\": 0.4691358024691358,\n \"acc_norm_stderr\": 0.027767689606833935\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35658409387222945,\n\
\ \"acc_stderr\": 0.012233642989273891,\n \"acc_norm\": 0.35658409387222945,\n\
\ \"acc_norm_stderr\": 0.012233642989273891\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.0303720158854282,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.0303720158854282\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702853,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702853\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.035282112582452306,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.035282112582452306\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4171587207764145,\n\
\ \"mc2_stderr\": 0.013880088765252275\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-23-38.548591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-23-38.548591.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-23-38.548591.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-23-38.548591.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_23_38.548591
path:
- results_2023-10-03T19-23-38.548591.parquet
- split: latest
path:
- results_2023-10-03T19-23-38.548591.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged](https://huggingface.co/dhmeltzer/Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged",
"harness_truthfulqa_mc_0",
split="train")
```
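The config and split names used above come straight from this card's YAML header. As a quick illustration (a minimal sketch, assuming only the `datasets` client; `repo` is just a local convenience variable), you can enumerate the available configs and pin either the `latest` split or a specific timestamped run:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# "latest" always resolves to the most recent run; the timestamped split
# ("2023_10_03T19_23_38.548591" here) pins this exact evaluation instead.
arc = load_dataset(repo, "harness_arc_challenge_25", split="latest")
```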
## Latest results
These are the [latest results from run 2023-10-03T19:23:38.548591](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged/blob/main/results_2023-10-03T19-23-38.548591.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4589360172299559,
"acc_stderr": 0.035095035290907385,
"acc_norm": 0.46314201145146056,
"acc_norm_stderr": 0.03508099078679357,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4171587207764145,
"mc2_stderr": 0.013880088765252275
},
"harness|arc:challenge|25": {
"acc": 0.4872013651877133,
"acc_stderr": 0.014606603181012541,
"acc_norm": 0.5366894197952219,
"acc_norm_stderr": 0.014572000527756993
},
"harness|hellaswag|10": {
"acc": 0.5822545309699263,
"acc_stderr": 0.004921798492608778,
"acc_norm": 0.780920135431189,
"acc_norm_stderr": 0.004127775403148707
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.03733626655383509,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.03733626655383509
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127155,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127155
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.03210494433751458,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.03210494433751458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.03423465100104284,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.03423465100104284
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.02512465352588513,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.02512465352588513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145647,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6073394495412844,
"acc_stderr": 0.020937505161201093,
"acc_norm": 0.6073394495412844,
"acc_norm_stderr": 0.020937505161201093
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5,
"acc_stderr": 0.03509312031717982,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03509312031717982
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.03190080389473235,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.03190080389473235
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.03057281131029961,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.03057281131029961
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6360153256704981,
"acc_stderr": 0.017205684809032232,
"acc_norm": 0.6360153256704981,
"acc_norm_stderr": 0.017205684809032232
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.523121387283237,
"acc_stderr": 0.026890297881303118,
"acc_norm": 0.523121387283237,
"acc_norm_stderr": 0.026890297881303118
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4691358024691358,
"acc_stderr": 0.027767689606833935,
"acc_norm": 0.4691358024691358,
"acc_norm_stderr": 0.027767689606833935
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35658409387222945,
"acc_stderr": 0.012233642989273891,
"acc_norm": 0.35658409387222945,
"acc_norm_stderr": 0.012233642989273891
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.0303720158854282,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.0303720158854282
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702853,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702853
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.035282112582452306,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.035282112582452306
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4171587207764145,
"mc2_stderr": 0.013880088765252275
}
}
```
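The aggregated metrics above are also exposed as a standalone `results` config (see the YAML header), whose `latest` split points at `results_2023-10-03T19-23-38.548591.parquet`. Below is a minimal sketch for loading it; the exact column layout of that parquet file is not documented here, so inspect the schema before relying on any particular field:
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-eli5-cleaned-wiki65k-1024_qlora_merged",
    "results",
    split="latest",
)
print(results)  # prints the features/schema and row count before use
```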
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B | 2023-10-03T19:28:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NousResearch/Nous-Capybara-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/Nous-Capybara-7B](https://huggingface.co/NousResearch/Nous-Capybara-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:27:10.043918](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B/blob/main/results_2023-10-03T19-27-10.043918.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48992939465492624,\n\
\ \"acc_stderr\": 0.034858011382079626,\n \"acc_norm\": 0.49378247762880534,\n\
\ \"acc_norm_stderr\": 0.03484120597101563,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5113008018715199,\n\
\ \"mc2_stderr\": 0.01580320776664755\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536598,\n\
\ \"acc_norm\": 0.552901023890785,\n \"acc_norm_stderr\": 0.014529380160526848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6158135829516033,\n\
\ \"acc_stderr\": 0.004854082479916908,\n \"acc_norm\": 0.8073093009360686,\n\
\ \"acc_norm_stderr\": 0.003936061455151112\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185555,\n\
\ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185555\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509567,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161549,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161549\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.033456784227567746,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.033456784227567746\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845443,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6880733944954128,\n \"acc_stderr\": 0.019862967976707245,\n \"\
acc_norm\": 0.6880733944954128,\n \"acc_norm_stderr\": 0.019862967976707245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.032365852526021595,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.032365852526021595\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6919831223628692,\n \"acc_stderr\": 0.030052389335605702,\n \
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.030052389335605702\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.03847021420456024,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.03847021420456024\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356389,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356389\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6960408684546615,\n\
\ \"acc_stderr\": 0.016448321686769046,\n \"acc_norm\": 0.6960408684546615,\n\
\ \"acc_norm_stderr\": 0.016448321686769046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.026864624366756643,\n\
\ \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.026864624366756643\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.028580341065138293,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.028580341065138293\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325956,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3767926988265971,\n\
\ \"acc_stderr\": 0.012376459593894402,\n \"acc_norm\": 0.3767926988265971,\n\
\ \"acc_norm_stderr\": 0.012376459593894402\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872404,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611551,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611551\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5113008018715199,\n\
\ \"mc2_stderr\": 0.01580320776664755\n }\n}\n```"
repo_url: https://huggingface.co/NousResearch/Nous-Capybara-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-27-10.043918.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- results_2023-10-03T19-27-10.043918.parquet
- split: latest
path:
- results_2023-10-03T19-27-10.043918.parquet
---
# Dataset Card for Evaluation run of NousResearch/Nous-Capybara-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Nous-Capybara-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Capybara-7B](https://huggingface.co/NousResearch/Nous-Capybara-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
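The available configurations and splits can also be enumerated programmatically. A minimal sketch, assuming the `get_dataset_config_names` and `get_dataset_split_names` helpers from the `datasets` library:
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B"

# The 61 per-task configurations, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# Each configuration exposes one split per run timestamp, plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_arc_challenge_25"))
```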
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B",
"harness_truthfulqa_mc_0",
split="train")
```
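The aggregated metrics live in the "results" configuration; loading its "latest" split is a small variation on the call above (both names are taken from the config listing in this card's metadata):
```python
from datasets import load_dataset

# "latest" always resolves to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B",
    "results",
    split="latest",
)
print(results)
```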
## Latest results
These are the [latest results from run 2023-10-03T19:27:10.043918](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B/blob/main/results_2023-10-03T19-27-10.043918.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48992939465492624,
"acc_stderr": 0.034858011382079626,
"acc_norm": 0.49378247762880534,
"acc_norm_stderr": 0.03484120597101563,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5113008018715199,
"mc2_stderr": 0.01580320776664755
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536598,
"acc_norm": 0.552901023890785,
"acc_norm_stderr": 0.014529380160526848
},
"harness|hellaswag|10": {
"acc": 0.6158135829516033,
"acc_stderr": 0.004854082479916908,
"acc_norm": 0.8073093009360686,
"acc_norm_stderr": 0.003936061455151112
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185555,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185555
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161549,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161549
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.033456784227567746,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.033456784227567746
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845443,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6880733944954128,
"acc_stderr": 0.019862967976707245,
"acc_norm": 0.6880733944954128,
"acc_norm_stderr": 0.019862967976707245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.032365852526021595,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.032365852526021595
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.030052389335605702,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.030052389335605702
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.03847021420456024,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.03847021420456024
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.04825729337356389,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.04825729337356389
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6960408684546615,
"acc_stderr": 0.016448321686769046,
"acc_norm": 0.6960408684546615,
"acc_norm_stderr": 0.016448321686769046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.026864624366756643,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.026864624366756643
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.028580341065138293,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.028580341065138293
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325956,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3767926988265971,
"acc_stderr": 0.012376459593894402,
"acc_norm": 0.3767926988265971,
"acc_norm_stderr": 0.012376459593894402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872404,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611551,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611551
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5113008018715199,
"mc2_stderr": 0.01580320776664755
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k | 2023-10-03T19:30:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mncai/Llama2-7B-guanaco-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Llama2-7B-guanaco-1k](https://huggingface.co/mncai/Llama2-7B-guanaco-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:29:13.374969](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k/blob/main/results_2023-10-03T19-29-13.374969.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48223917491095497,\n\
\ \"acc_stderr\": 0.03531211391105555,\n \"acc_norm\": 0.4860341805278453,\n\
\ \"acc_norm_stderr\": 0.03529557569112071,\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520926,\n \"mc2\": 0.4769485190285405,\n\
\ \"mc2_stderr\": 0.015017841350265305\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5204778156996587,\n \"acc_stderr\": 0.014599131353035004,\n\
\ \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097667\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6121290579565823,\n\
\ \"acc_stderr\": 0.004862690594815707,\n \"acc_norm\": 0.8053176658036247,\n\
\ \"acc_norm_stderr\": 0.003951467386597723\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851316,\n\
\ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851316\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655802,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655802\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6735751295336787,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.6735751295336787,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46923076923076923,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.46923076923076923,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6697247706422018,\n \"acc_stderr\": 0.020164466336342977,\n \"\
acc_norm\": 0.6697247706422018,\n \"acc_norm_stderr\": 0.020164466336342977\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402546,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402546\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015476,\n \"\
acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015476\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6371308016877637,\n \"acc_stderr\": 0.03129920825530213,\n \
\ \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.03129920825530213\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.03922378290610991,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.03922378290610991\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6475095785440613,\n\
\ \"acc_stderr\": 0.01708415024408138,\n \"acc_norm\": 0.6475095785440613,\n\
\ \"acc_norm_stderr\": 0.01708415024408138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258872,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258872\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29832402234636873,\n\
\ \"acc_stderr\": 0.015301840045129278,\n \"acc_norm\": 0.29832402234636873,\n\
\ \"acc_norm_stderr\": 0.015301840045129278\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089782,\n\
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401262,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36897001303780963,\n\
\ \"acc_stderr\": 0.01232393665017486,\n \"acc_norm\": 0.36897001303780963,\n\
\ \"acc_norm_stderr\": 0.01232393665017486\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.030290619180485694,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.030290619180485694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45098039215686275,\n \"acc_stderr\": 0.02013038831290453,\n \
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.02013038831290453\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457923,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457923\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520926,\n \"mc2\": 0.4769485190285405,\n\
\ \"mc2_stderr\": 0.015017841350265305\n }\n}\n```"
repo_url: https://huggingface.co/mncai/Llama2-7B-guanaco-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-29-13.374969.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- results_2023-10-03T19-29-13.374969.parquet
- split: latest
path:
- results_2023-10-03T19-29-13.374969.parquet
---
# Dataset Card for Evaluation run of mncai/Llama2-7B-guanaco-1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/Llama2-7B-guanaco-1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/Llama2-7B-guanaco-1k](https://huggingface.co/mncai/Llama2-7B-guanaco-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k",
"harness_truthfulqa_mc_0",
split="train")
```
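As a further sketch (assuming the `datasets` library is installed and the Hugging Face Hub is reachable), you can also enumerate the available configurations programmatically and pull the `latest` split of a single subtask; the config name used here, `harness_hendrycksTest_moral_scenarios_5`, is one of the per-subtask configs declared in the YAML metadata above:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k"

# List every evaluation config declared for this repo (61 in total).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "latest" split of any config always points at the most recent run.
details = load_dataset(repo, "harness_hendrycksTest_moral_scenarios_5", split="latest")
print(details)  # per-example details for the moral_scenarios MMLU subtask
```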
## Latest results
These are the [latest results from run 2023-10-03T19:29:13.374969](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k/blob/main/results_2023-10-03T19-29-13.374969.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48223917491095497,
"acc_stderr": 0.03531211391105555,
"acc_norm": 0.4860341805278453,
"acc_norm_stderr": 0.03529557569112071,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520926,
"mc2": 0.4769485190285405,
"mc2_stderr": 0.015017841350265305
},
"harness|arc:challenge|25": {
"acc": 0.5204778156996587,
"acc_stderr": 0.014599131353035004,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097667
},
"harness|hellaswag|10": {
"acc": 0.6121290579565823,
"acc_stderr": 0.004862690594815707,
"acc_norm": 0.8053176658036247,
"acc_norm_stderr": 0.003951467386597723
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851316,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851316
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655802,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655802
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6735751295336787,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.6735751295336787,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46923076923076923,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.46923076923076923,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6697247706422018,
"acc_stderr": 0.020164466336342977,
"acc_norm": 0.6697247706422018,
"acc_norm_stderr": 0.020164466336342977
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402546,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402546
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015476,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015476
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6371308016877637,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.6371308016877637,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.03922378290610991,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.03922378290610991
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6475095785440613,
"acc_stderr": 0.01708415024408138,
"acc_norm": 0.6475095785440613,
"acc_norm_stderr": 0.01708415024408138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.026854257928258872,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.026854257928258872
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29832402234636873,
"acc_stderr": 0.015301840045129278,
"acc_norm": 0.29832402234636873,
"acc_norm_stderr": 0.015301840045129278
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401262,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36897001303780963,
"acc_stderr": 0.01232393665017486,
"acc_norm": 0.36897001303780963,
"acc_norm_stderr": 0.01232393665017486
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.02013038831290453,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.02013038831290453
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333335,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333335
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457923,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457923
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520926,
"mc2": 0.4769485190285405,
"mc2_stderr": 0.015017841350265305
}
}
```
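These aggregated numbers are also persisted as the `results` configuration declared in the YAML metadata above. A minimal sketch of reading them back (the exact nesting of the schema may differ from the flattened JSON shown here):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# "latest" points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k",
    "results",
    split="latest",
)
print(results[0])  # should contain the acc / acc_norm / mc1 / mc2 aggregates above
```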
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500 | 2023-10-03T19:38:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mncai/Llama2-7B-guanaco-dolphin-500
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Llama2-7B-guanaco-dolphin-500](https://huggingface.co/mncai/Llama2-7B-guanaco-dolphin-500)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:36:50.573905](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500/blob/main/results_2023-10-03T19-36-50.573905.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48983448652651995,\n\
\ \"acc_stderr\": 0.035363637901079645,\n \"acc_norm\": 0.49384112425804355,\n\
\ \"acc_norm_stderr\": 0.03534517584489329,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.46938056221953073,\n\
\ \"mc2_stderr\": 0.015439179764216509\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298964,\n\
\ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182526\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6216889065923122,\n\
\ \"acc_stderr\": 0.0048397464915235135,\n \"acc_norm\": 0.8162716590320653,\n\
\ \"acc_norm_stderr\": 0.003864710367645059\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205615,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205615\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5387096774193548,\n\
\ \"acc_stderr\": 0.02835863485983694,\n \"acc_norm\": 0.5387096774193548,\n\
\ \"acc_norm_stderr\": 0.02835863485983694\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126177,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126177\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.671559633027523,\n \"acc_stderr\": 0.020135902797298405,\n \"\
acc_norm\": 0.671559633027523,\n \"acc_norm_stderr\": 0.020135902797298405\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656629,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656629\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6371308016877637,\n \"acc_stderr\": 0.031299208255302136,\n \
\ \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.031299208255302136\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n\
\ \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6513409961685823,\n\
\ \"acc_stderr\": 0.01704124314349097,\n \"acc_norm\": 0.6513409961685823,\n\
\ \"acc_norm_stderr\": 0.01704124314349097\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.546242774566474,\n \"acc_stderr\": 0.026803720583206174,\n\
\ \"acc_norm\": 0.546242774566474,\n \"acc_norm_stderr\": 0.026803720583206174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468636,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468636\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214263,\n\
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214263\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401262,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.0289473388516141,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.0289473388516141\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3663624511082138,\n\
\ \"acc_stderr\": 0.012305658346838444,\n \"acc_norm\": 0.3663624511082138,\n\
\ \"acc_norm_stderr\": 0.012305658346838444\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47058823529411764,\n \"acc_stderr\": 0.02019280827143379,\n \
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.02019280827143379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4857142857142857,\n \"acc_stderr\": 0.03199615232806287,\n\
\ \"acc_norm\": 0.4857142857142857,\n \"acc_norm_stderr\": 0.03199615232806287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.46938056221953073,\n\
\ \"mc2_stderr\": 0.015439179764216509\n }\n}\n```"
repo_url: https://huggingface.co/mncai/Llama2-7B-guanaco-dolphin-500
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-36-50.573905.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-36-50.573905.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-36-50.573905.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- results_2023-10-03T19-36-50.573905.parquet
- split: latest
path:
- results_2023-10-03T19-36-50.573905.parquet
---
# Dataset Card for Evaluation run of mncai/Llama2-7B-guanaco-dolphin-500
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/Llama2-7B-guanaco-dolphin-500
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/Llama2-7B-guanaco-dolphin-500](https://huggingface.co/mncai/Llama2-7B-guanaco-dolphin-500) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500",
"harness_truthfulqa_mc_0",
split="train")
```
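Each configuration also exposes both a timestamped split and a `latest` alias (see the `configs` list in this card's metadata), so you can either pin a specific run or always follow the most recent one. A minimal sketch; the config and split names below are taken directly from this card:
```python
from datasets import load_dataset

# Pin the specific run via its timestamped split name (as listed in the configs above)...
run = load_dataset(
    "open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500",
    "harness_truthfulqa_mc_0",
    split="2023_10_03T19_36_50.573905",
)

# ...or follow whatever the most recent run is via the "latest" alias.
latest = load_dataset(
    "open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```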
## Latest results
These are the [latest results from run 2023-10-03T19:36:50.573905](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500/blob/main/results_2023-10-03T19-36-50.573905.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48983448652651995,
"acc_stderr": 0.035363637901079645,
"acc_norm": 0.49384112425804355,
"acc_norm_stderr": 0.03534517584489329,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.46938056221953073,
"mc2_stderr": 0.015439179764216509
},
"harness|arc:challenge|25": {
"acc": 0.5255972696245734,
"acc_stderr": 0.014592230885298964,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182526
},
"harness|hellaswag|10": {
"acc": 0.6216889065923122,
"acc_stderr": 0.0048397464915235135,
"acc_norm": 0.8162716590320653,
"acc_norm_stderr": 0.003864710367645059
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205615,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205615
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5387096774193548,
"acc_stderr": 0.02835863485983694,
"acc_norm": 0.5387096774193548,
"acc_norm_stderr": 0.02835863485983694
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126177,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126177
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46218487394957986,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.46218487394957986,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.020135902797298405,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.020135902797298405
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656629,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656629
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6371308016877637,
"acc_stderr": 0.031299208255302136,
"acc_norm": 0.6371308016877637,
"acc_norm_stderr": 0.031299208255302136
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.03924746876751129,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.03924746876751129
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7393162393162394,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.7393162393162394,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6513409961685823,
"acc_stderr": 0.01704124314349097,
"acc_norm": 0.6513409961685823,
"acc_norm_stderr": 0.01704124314349097
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.546242774566474,
"acc_stderr": 0.026803720583206174,
"acc_norm": 0.546242774566474,
"acc_norm_stderr": 0.026803720583206174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468636,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468636
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214263,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401262,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.0289473388516141,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.0289473388516141
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3663624511082138,
"acc_stderr": 0.012305658346838444,
"acc_norm": 0.3663624511082138,
"acc_norm_stderr": 0.012305658346838444
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.02019280827143379,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.02019280827143379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4857142857142857,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.4857142857142857,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.46938056221953073,
"mc2_stderr": 0.015439179764216509
}
}
```
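If you want the aggregated numbers above programmatically rather than through the `results` config, one option is to fetch the raw JSON file linked in the "Latest results" heading. A minimal sketch, assuming the file sits at the repo root as that link suggests:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated-results JSON referenced above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500",
    filename="results_2023-10-03T19-36-50.573905.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Inspect the top-level structure; the per-task metrics shown above live in here.
print(list(results.keys()))
```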
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]