---
pretty_name: Evaluation run of PulsarAI/Neural-una-cybertron-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/Neural-una-cybertron-7b](https://huggingface.co/PulsarAI/Neural-una-cybertron-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
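\nFor instance, the aggregated results could be loaded as follows (a sketch, assuming\
\ the \"results\" configuration provides a \"latest\" split, as these detail datasets\
\ typically do):\n```python\nfrom datasets import load_dataset\n# Load the aggregated metrics rather than per-sample details\nresults = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b\"\
,\n\t\"results\",\n\tsplit=\"latest\")\n```\n\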
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T19:49:04.690282](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b/blob/main/results_2023-12-09T19-49-04.690282.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6303659109315263,\n\
\ \"acc_stderr\": 0.032701507219088696,\n \"acc_norm\": 0.6326609738082676,\n\
\ \"acc_norm_stderr\": 0.033364878181962175,\n \"mc1\": 0.49938800489596086,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6498823682901811,\n\
\ \"mc2_stderr\": 0.01528184743332698\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820164,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6704839673371839,\n\
\ \"acc_stderr\": 0.004690768393854475,\n \"acc_norm\": 0.8450507866958773,\n\
\ \"acc_norm_stderr\": 0.0036111673029597625\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155247,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155247\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"\
acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790492,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790492\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635474,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635474\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847835,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847835\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381398,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381398\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\
\ \"acc_stderr\": 0.016223533510365113,\n \"acc_norm\": 0.3787709497206704,\n\
\ \"acc_norm_stderr\": 0.016223533510365113\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388856,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388856\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623325,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623325\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49938800489596086,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6498823682901811,\n\
\ \"mc2_stderr\": 0.01528184743332698\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920524\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5231235784685367,\n \
\ \"acc_stderr\": 0.013757748544245336\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/Neural-una-cybertron-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-49-04.690282.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- '**/details_harness|winogrande|5_2023-12-09T19-49-04.690282.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T19-49-04.690282.parquet'
- config_name: results
data_files:
- split: 2023_12_09T19_49_04.690282
path:
- results_2023-12-09T19-49-04.690282.parquet
- split: latest
path:
- results_2023-12-09T19-49-04.690282.parquet
---
# Dataset Card for Evaluation run of PulsarAI/Neural-una-cybertron-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/Neural-una-cybertron-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/Neural-una-cybertron-7b](https://huggingface.co/PulsarAI/Neural-una-cybertron-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b",
"harness_winogrande_5",
split="train")
```
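The per-run split names are derived mechanically from the run timestamp: `-` and `:` are replaced with `_`, as can be seen in the configuration list above. A small sketch of that convention (an observation from this card's config, not a documented API guarantee), useful when addressing a specific run rather than the `latest` split:

```python
# The run timestamp for this card, as it appears in the results filename.
timestamp = "2023-12-09T19:49:04.690282"

# Split names in each configuration replace '-' and ':' with '_'.
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_12_09T19_49_04.690282
```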
## Latest results
These are the [latest results from run 2023-12-09T19:49:04.690282](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b/blob/main/results_2023-12-09T19-49-04.690282.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6303659109315263,
"acc_stderr": 0.032701507219088696,
"acc_norm": 0.6326609738082676,
"acc_norm_stderr": 0.033364878181962175,
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6498823682901811,
"mc2_stderr": 0.01528184743332698
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820164,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238363
},
"harness|hellaswag|10": {
"acc": 0.6704839673371839,
"acc_stderr": 0.004690768393854475,
"acc_norm": 0.8450507866958773,
"acc_norm_stderr": 0.0036111673029597625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155247,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155247
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790492,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790492
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847835,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847835
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381398,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381398
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.016223533510365113,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.016223533510365113
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388856,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388856
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623325,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623325
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6498823682901811,
"mc2_stderr": 0.01528184743332698
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920524
},
"harness|gsm8k|5": {
"acc": 0.5231235784685367,
"acc_stderr": 0.013757748544245336
}
}
```
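The aggregated metrics above are plain JSON, so the headline numbers can be pulled out with the standard library alone. A minimal sketch, using a small excerpt of the results shown above:

```python
import json

# Excerpt of the aggregated results shown above.
raw = """
{
  "all": {"acc": 0.6303659109315263, "acc_norm": 0.6326609738082676},
  "harness|winogrande|5": {"acc": 0.8066298342541437},
  "harness|gsm8k|5": {"acc": 0.5231235784685367}
}
"""
results = json.loads(raw)

# Overall accuracy, rounded for display.
print(round(results["all"]["acc"], 4))              # 0.6304
# Per-task accuracy for a single harness task.
print(round(results["harness|gsm8k|5"]["acc"], 4))  # 0.5231
```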
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
neil-code/autotrain-data-bert-base-uncased | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: bert-base-uncased
## Dataset Description
This dataset has been automatically processed by AutoTrain for project bert-base-uncased.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "well it's clear now why europeans can't differ niggers and shitskins from human",
"target": 0
},
{
"text": "These boys will then grow up with people making fun of them and they will then hate their parents for ruining their lives.",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['hate_speech', 'no_hate_speech'], id=None)"
}
```
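The `target` field is a `ClassLabel`, i.e. an integer index into the `names` list declared above. The id-to-name mapping can be mimicked with plain Python (a sketch of what the feature encodes, not the `datasets` API itself):

```python
# ClassLabel names as declared in the dataset fields above.
names = ["hate_speech", "no_hate_speech"]

# int -> str: the label id stored in each example indexes into `names`.
int2str = dict(enumerate(names))
# str -> int: the reverse mapping.
str2int = {name: i for i, name in enumerate(names)}

print(int2str[0])                 # hate_speech
print(str2int["no_hate_speech"])  # 1
```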
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 798 |
| valid | 200 |
|
on1onmangoes/ZED | ---
language:
- en
license: apache-2.0
size_categories:
- n<1K
task_categories:
- conversational
pretty_name: ZED
dataset_info:
features:
- name: file_name
dtype: string
- name: wav
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: emotion
sequence: string
- name: duration
dtype: float32
- name: emotion_start
sequence: float32
- name: emotion_end
sequence: float32
splits:
- name: train
num_bytes: 30998607.0
num_examples: 180
download_size: 30998412
dataset_size: 30998607.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for My Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information) |
result-kand2-sdxl-wuerst-karlo/040dec0a | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 160
num_examples: 10
download_size: 1292
dataset_size: 160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "040dec0a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SaffalPoosh/scribble_controlnet_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: scribble
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3632352891.0
num_examples: 10000
download_size: 772501479
dataset_size: 3632352891.0
---
# Dataset Card for "data_coco"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_emphatic_reflex | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 8293
num_examples: 18
- name: train
num_bytes: 6961
num_examples: 17
download_size: 17891
dataset_size: 15254
---
# Dataset Card for "MULTI_VALUE_rte_emphatic_reflex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polejowska/NCT-CRC-HE-45k | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ADI
'1': BACK
'2': DEB
'3': LYM
'4': MUC
'5': MUS
'6': NORM
'7': STR
'8': TUM
splits:
- name: train
num_bytes: 2820558485.0
num_examples: 45000
download_size: 1579401162
dataset_size: 2820558485.0
---
# Dataset Card for "NCT-CRC-HE-45k"
```bibtex
@dataset{kather_jakob_nikolas_2018_1214456,
  author    = {Kather, Jakob Nikolas and
               Halama, Niels and
               Marx, Alexander},
  title     = {{100,000 histological images of human colorectal
               cancer and healthy tissue}},
  month     = apr,
  year      = 2018,
  publisher = {Zenodo},
  version   = {v0.1},
  doi       = {10.5281/zenodo.1214456},
  url       = {https://doi.org/10.5281/zenodo.1214456}
}
```
 |
neph1/Alpaca-Lora-GPT4-Swedish-Refined | ---
language:
- sv
---
This dataset is based on https://huggingface.co/datasets/jeremyc/Alpaca-Lora-GPT4-Swedish

I've done extensive cleaning (but I'm not yet done). This includes:

- Purging erroneous and sometimes offensive generations by the translator
- Fixing code instances up to row 10300. All of the code had been botched in translation. There may still be some HTML instances to fix, but at least all Python should now be valid.
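The Python-validity claim above can be checked mechanically. A minimal sketch using only the standard library (not the author's actual tooling):

```python
import ast

def is_valid_python(snippet: str) -> bool:
    """Return True if the snippet parses as Python source code."""
    try:
        ast.parse(snippet)
        return True
    except SyntaxError:
        return False

# A translator that mangles brackets or quotes yields snippets that fail to parse:
# is_valid_python("print('hej')")  -> True
# is_valid_python("print('hej'")   -> False
```

Running such a check over the code-bearing rows gives a quick estimate of how many instances still need fixing.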
|
YunDDDD/customllama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
isaaccorley/floating-objects | ---
license: apache-2.0
---
|
RafiBrent/Kather_Data | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ADI
'1': BACK
'2': DEB
'3': LYM
'4': MUC
'5': MUS
'6': NORM
'7': STR
'8': TUM
splits:
- name: train
num_bytes: 13700396308.48
num_examples: 89996
- name: validation
num_bytes: 1522935967.768
num_examples: 10004
- name: test
num_bytes: 1093018719.36
num_examples: 7180
download_size: 14369418825
dataset_size: 16316350995.608
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ASCCCCCCCC/mix_info | ---
license: apache-2.0
---
|
jpacifico/French-Alpaca-dataset-Instruct-55K | ---
language:
- fr
size_categories:
- 10K<n<100K
---
55,184 French instructions generated by OpenAI GPT-3.5, in the Alpaca format, for fine-tuning general-purpose models.

Created by Jonathan Pacifico.

License: apache-2.0
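Since the instructions follow the Alpaca format, they can be assembled into training prompts with the standard Alpaca template. A minimal sketch, assuming the usual `instruction`/`input` fields (field names are not confirmed by this card):

```python
def format_alpaca(instruction: str, input_text: str = "") -> str:
    """Build a training prompt using the standard Alpaca template."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```

The response text from each row is then appended after `### Response:` to form the full training example.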
Please credit my name if you use this dataset in your project. |
benayas/snips_augmented_20pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1030579
num_examples: 13084
download_size: 495145
dataset_size: 1030579
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anumafzal94/10k-arxiv-4096 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: test
num_bytes: 216556336
num_examples: 6438
- name: train
num_bytes: 351240876.3954603
num_examples: 10000
- name: validation
num_bytes: 33463774.258004352
num_examples: 996
download_size: 166397342
dataset_size: 601260986.6534647
---
# Dataset Card for "10k-arxiv-4096"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/2f97018c | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1341
dataset_size: 188
---
# Dataset Card for "2f97018c"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ddyuudd__mistral_dmbr10_32_sig | ---
pretty_name: Evaluation run of ddyuudd/mistral_dmbr10_32_sig
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ddyuudd/mistral_dmbr10_32_sig](https://huggingface.co/ddyuudd/mistral_dmbr10_32_sig)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ddyuudd__mistral_dmbr10_32_sig\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T12:32:41.726473](https://huggingface.co/datasets/open-llm-leaderboard/details_ddyuudd__mistral_dmbr10_32_sig/blob/main/results_2024-03-07T12-32-41.726473.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6116504471153977,\n\
\ \"acc_stderr\": 0.032943322563727395,\n \"acc_norm\": 0.6166317430983218,\n\
\ \"acc_norm_stderr\": 0.03361626974956549,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.44335454039876326,\n\
\ \"mc2_stderr\": 0.014873057460151717\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5580204778156996,\n \"acc_stderr\": 0.014512682523128342,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221005\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6259709221270663,\n\
\ \"acc_stderr\": 0.004828822920915221,\n \"acc_norm\": 0.8257319259111731,\n\
\ \"acc_norm_stderr\": 0.0037856457412359483\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044912,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044912\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305526,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305526\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011746,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011746\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296525,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296525\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n\
\ \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n\
\ \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303529,\n\
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303529\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233648,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233648\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963046,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963046\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6225490196078431,\n \"acc_stderr\": 0.01961085147488029,\n \
\ \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.01961085147488029\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484375,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484375\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.44335454039876326,\n\
\ \"mc2_stderr\": 0.014873057460151717\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.378316906747536,\n \
\ \"acc_stderr\": 0.013358407831777115\n }\n}\n```"
repo_url: https://huggingface.co/ddyuudd/mistral_dmbr10_32_sig
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-32-41.726473.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-32-41.726473.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- '**/details_harness|winogrande|5_2024-03-07T12-32-41.726473.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T12-32-41.726473.parquet'
- config_name: results
data_files:
- split: 2024_03_07T12_32_41.726473
path:
- results_2024-03-07T12-32-41.726473.parquet
- split: latest
path:
- results_2024-03-07T12-32-41.726473.parquet
---
# Dataset Card for Evaluation run of ddyuudd/mistral_dmbr10_32_sig
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ddyuudd/mistral_dmbr10_32_sig](https://huggingface.co/ddyuudd/mistral_dmbr10_32_sig) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ddyuudd__mistral_dmbr10_32_sig",
"harness_winogrande_5",
split="train")
```
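The configuration names follow a consistent pattern: `harness_<task>_<n_shots>`, with `-` and `:` in task names replaced by `_`. A small illustrative helper (not part of any official API, just a sketch of the naming scheme used above):

```python
def details_config_name(task: str, num_fewshot: int) -> str:
    """Build the config name this dataset uses for a given harness task.

    e.g. ("hendrycksTest-anatomy", 5) -> "harness_hendrycksTest_anatomy_5"
    """
    # Config names replace the "-" and ":" separators with underscores.
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{num_fewshot}"


print(details_config_name("hendrycksTest-anatomy", 5))
# harness_hendrycksTest_anatomy_5
print(details_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```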
## Latest results
These are the [latest results from run 2024-03-07T12:32:41.726473](https://huggingface.co/datasets/open-llm-leaderboard/details_ddyuudd__mistral_dmbr10_32_sig/blob/main/results_2024-03-07T12-32-41.726473.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6116504471153977,
"acc_stderr": 0.032943322563727395,
"acc_norm": 0.6166317430983218,
"acc_norm_stderr": 0.03361626974956549,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.44335454039876326,
"mc2_stderr": 0.014873057460151717
},
"harness|arc:challenge|25": {
"acc": 0.5580204778156996,
"acc_stderr": 0.014512682523128342,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221005
},
"harness|hellaswag|10": {
"acc": 0.6259709221270663,
"acc_stderr": 0.004828822920915221,
"acc_norm": 0.8257319259111731,
"acc_norm_stderr": 0.0037856457412359483
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044912,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044912
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305526,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305526
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011746,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296525,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296525
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233648,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233648
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963046,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963046
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.01961085147488029,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.01961085147488029
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484375,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484375
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.44335454039876326,
"mc2_stderr": 0.014873057460151717
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
},
"harness|gsm8k|5": {
"acc": 0.378316906747536,
"acc_stderr": 0.013358407831777115
}
}
```
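Per-task scores like the ones above can be aggregated programmatically. As an illustrative sketch (using only a small hand-copied subset of the values shown, not the full results file), averaging the `acc` scores over the MMLU (`hendrycksTest`) tasks:

```python
# Subset of the results dict above; keys follow the "harness|<task>|<shots>" scheme.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.631578947368421},
}

# Select the MMLU tasks and average their accuracies.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(round(mean_acc, 4))
# 0.5131
```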
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_cookinai__CM-14 | ---
pretty_name: Evaluation run of cookinai/CM-14
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cookinai/CM-14](https://huggingface.co/cookinai/CM-14) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cookinai__CM-14\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T15:39:56.317779](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__CM-14/blob/main/results_2024-01-10T15-39-56.317779.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6580185798671865,\n\
\ \"acc_stderr\": 0.03192368037476644,\n \"acc_norm\": 0.6580152943319993,\n\
\ \"acc_norm_stderr\": 0.03258224944124306,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538765,\n \"mc2\": 0.6190301844108673,\n\
\ \"mc2_stderr\": 0.015232563824973148\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892978,\n\
\ \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276513\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6870145389364668,\n\
\ \"acc_stderr\": 0.004627607991626914,\n \"acc_norm\": 0.8697470623381797,\n\
\ \"acc_norm_stderr\": 0.003358936279867257\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n\
\ \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n\
\ \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525995,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525995\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323788,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323788\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n\
\ \"acc_stderr\": 0.012737361318730581,\n \"acc_norm\": 0.4641460234680574,\n\
\ \"acc_norm_stderr\": 0.012737361318730581\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146367,\n \
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146367\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538765,\n \"mc2\": 0.6190301844108673,\n\
\ \"mc2_stderr\": 0.015232563824973148\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989245\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.012333447581047539\n }\n}\n```"
repo_url: https://huggingface.co/cookinai/CM-14
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-39-56.317779.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-39-56.317779.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- '**/details_harness|winogrande|5_2024-01-10T15-39-56.317779.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T15-39-56.317779.parquet'
- config_name: results
data_files:
- split: 2024_01_10T15_39_56.317779
path:
- results_2024-01-10T15-39-56.317779.parquet
- split: latest
path:
- results_2024-01-10T15-39-56.317779.parquet
---
# Dataset Card for Evaluation run of cookinai/CM-14
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cookinai/CM-14](https://huggingface.co/cookinai/CM-14) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cookinai__CM-14",
"harness_winogrande_5",
split="train")
```
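As the configuration listing above shows, each timestamped split name is the run timestamp with the `-` and `:` separators replaced by underscores (e.g. `2024-01-10T15:39:56.317779` becomes `2024_01_10T15_39_56.317779`). A minimal sketch of that mapping (the helper name is illustrative, not part of any API):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp (as found in results_*.json filenames)
    to the split name used in each configuration."""
    # Date part: 2024-01-10 -> 2024_01_10; time part: 15:39:56.317779 -> 15_39_56.317779
    date, _, time = ts.partition("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2024-01-10T15:39:56.317779"))
```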
## Latest results
These are the [latest results from run 2024-01-10T15:39:56.317779](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__CM-14/blob/main/results_2024-01-10T15-39-56.317779.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6580185798671865,
"acc_stderr": 0.03192368037476644,
"acc_norm": 0.6580152943319993,
"acc_norm_stderr": 0.03258224944124306,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538765,
"mc2": 0.6190301844108673,
"mc2_stderr": 0.015232563824973148
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892978,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276513
},
"harness|hellaswag|10": {
"acc": 0.6870145389364668,
"acc_stderr": 0.004627607991626914,
"acc_norm": 0.8697470623381797,
"acc_norm_stderr": 0.003358936279867257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525995,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525995
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323788,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323788
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.01658868086453063,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.01658868086453063
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730581,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730581
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.0279715413701706,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.0279715413701706
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146367,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146367
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538765,
"mc2": 0.6190301844108673,
"mc2_stderr": 0.015232563824973148
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989245
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047539
}
}
```
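The per-task entries above can also be aggregated client-side once the JSON is loaded. A minimal sketch, assuming `results` is a dict shaped like the JSON above (the helper name is illustrative, not part of any API):

```python
import json

def mean_acc(results: dict) -> float:
    """Average the "acc" metric over per-task entries,
    skipping the pre-aggregated "all" block."""
    accs = [v["acc"] for k, v in results.items()
            if k != "all" and "acc" in v]
    return sum(accs) / len(accs)

# A tiny subset of the results JSON shown above.
results = json.loads("""
{
  "all": {"acc": 0.6580185798671865},
  "harness|winogrande|5": {"acc": 0.8105761641673244},
  "harness|gsm8k|5": {"acc": 0.7225170583775588}
}
""")
print(f"{mean_acc(results):.4f}")
```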
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of jambroz/sixtyoneeighty-7b-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jambroz/sixtyoneeighty-7b-dpo](https://huggingface.co/jambroz/sixtyoneeighty-7b-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jambroz__sixtyoneeighty-7b-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T00:35:12.303705](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__sixtyoneeighty-7b-dpo/blob/main/results_2024-04-06T00-35-12.303705.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6521876102897978,\n\
\ \"acc_stderr\": 0.03215157161371772,\n \"acc_norm\": 0.6535764590001866,\n\
\ \"acc_norm_stderr\": 0.03280152628146147,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6764186293436338,\n\
\ \"mc2_stderr\": 0.014863055080858209\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441374,\n\
\ \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623496\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6803425612427804,\n\
\ \"acc_stderr\": 0.004653907471785642,\n \"acc_norm\": 0.864070902210715,\n\
\ \"acc_norm_stderr\": 0.0034201285023462203\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718871,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718871\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468355,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468355\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\
\ \"acc_stderr\": 0.016476342210254,\n \"acc_norm\": 0.4145251396648045,\n\
\ \"acc_norm_stderr\": 0.016476342210254\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"\
acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6764186293436338,\n\
\ \"mc2_stderr\": 0.014863055080858209\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6224412433661866,\n \
\ \"acc_stderr\": 0.01335315066635854\n }\n}\n```"
repo_url: https://huggingface.co/jambroz/sixtyoneeighty-7b-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|arc:challenge|25_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|gsm8k|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hellaswag|10_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-35-12.303705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T00-35-12.303705.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- '**/details_harness|winogrande|5_2024-04-06T00-35-12.303705.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T00-35-12.303705.parquet'
- config_name: results
data_files:
- split: 2024_04_06T00_35_12.303705
path:
- results_2024-04-06T00-35-12.303705.parquet
- split: latest
path:
- results_2024-04-06T00-35-12.303705.parquet
---
# Dataset Card for Evaluation run of jambroz/sixtyoneeighty-7b-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jambroz/sixtyoneeighty-7b-dpo](https://huggingface.co/jambroz/sixtyoneeighty-7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jambroz__sixtyoneeighty-7b-dpo",
"harness_winogrande_5",
split="train")
```
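Each configuration's splits are named after run timestamps, with a `latest` alias pointing at the newest one. As a sketch (using this run's timestamp plus a hypothetical earlier run), note that the `YYYY_MM_DDTHH_MM_SS` naming sorts lexicographically in chronological order, so the newest run can be recovered with a plain `max` over the split names:

```python
# Hypothetical list of timestamped split names from successive runs;
# the first entry is invented for illustration, the second is this run's.
# The YYYY_MM_DDTHH_MM_SS.micro format sorts lexicographically in
# chronological order, so max() yields the most recent run.
splits = [
    "2023_12_09T19_49_04.690282",  # hypothetical earlier run
    "2024_04_06T00_35_12.303705",  # this run
]
latest = max(splits)
# latest == "2024_04_06T00_35_12.303705", the run the "latest" alias points to
```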
## Latest results
These are the [latest results from run 2024-04-06T00:35:12.303705](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__sixtyoneeighty-7b-dpo/blob/main/results_2024-04-06T00-35-12.303705.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6521876102897978,
"acc_stderr": 0.03215157161371772,
"acc_norm": 0.6535764590001866,
"acc_norm_stderr": 0.03280152628146147,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6764186293436338,
"mc2_stderr": 0.014863055080858209
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441374,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623496
},
"harness|hellaswag|10": {
"acc": 0.6803425612427804,
"acc_stderr": 0.004653907471785642,
"acc_norm": 0.864070902210715,
"acc_norm_stderr": 0.0034201285023462203
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718871,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718871
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468355,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468355
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210254,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210254
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6764186293436338,
"mc2_stderr": 0.014863055080858209
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938278
},
"harness|gsm8k|5": {
"acc": 0.6224412433661866,
"acc_stderr": 0.01335315066635854
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
neural-commons/ml_word_list | ---
dataset_info:
features:
- name: words
dtype: string
splits:
- name: train
num_bytes: 1346373
num_examples: 100000
- name: test
num_bytes: 1346237
num_examples: 100000
download_size: 2617143
dataset_size: 2692610
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
iahlt/he-universal_morphologies | ---
dataset_info:
features:
- name: lemma
dtype: string
- name: forms
sequence:
- name: word
dtype: string
- name: Aktionsart
sequence:
class_label:
names:
'0': STAT
'1': DYN
'2': TEL
'3': ATEL
'4': PCT
'5': DUR
'6': ACH
'7': ACCMP
'8': SEMEL
'9': ACTY
- name: Animacy
sequence:
class_label:
names:
'0': ANIM
'1': INAN
'2': HUM
'3': NHUM
- name: Argument_Marking
sequence:
class_label:
names:
'0': ARGNO1S
'1': ARGNO2S
'2': ARGNO3S
'3': ARGNO1P
'4': ARGNO2P
'5': ARGNO3P
'6': ARGAC1S
'7': ARGAC2S
'8': ARGAC3S
'9': ARGAC1P
'10': ARGAC2P
'11': ARGAC3P
'12': ARGAB1S
'13': ARGAB2S
'14': ARGAB3S
'15': ARGAB1P
'16': ARGAB2P
'17': ARGAB3P
'18': ARGER1S
'19': ARGER2S
'20': ARGER3S
'21': ARGER1P
'22': ARGER2P
'23': ARGER3P
'24': ARGDA1S
'25': ARGDA2S
'26': ARGDA3S
'27': ARGDA1P
'28': ARGDA2P
'29': ARGDA3P
'30': ARGBE1S
'31': ARGBE2S
'32': ARGBE3S
'33': ARGBE1P
'34': ARGBE2P
'35': ARGBE3P
- name: Aspect
sequence:
class_label:
names:
'0': IPFV
'1': PFV
'2': PRF
'3': PROG
'4': PROSP
'5': ITER
'6': HAB
- name: Case
sequence:
class_label:
names:
'0': NOM
'1': ACC
'2': ERG
'3': ABS
'4': NOMS
'5': DAT
'6': BEN
'7': PRP
'8': GEN
'9': REL
'10': PRT
'11': INS
'12': COM
'13': VOC
'14': COMPV
'15': EQTV
'16': PRIV
'17': PROPR
'18': AVR
'19': FRML
'20': TRANS
'21': BYWAY
'22': INTER
'23': AT
'24': POST
'25': IN
'26': CIRC
'27': ANTE
'28': APUD
'29': 'ON'
'30': ONHR
'31': ONVR
'32': SUB
'33': REM
'34': PROXM
'35': ESS
'36': ALL
'37': ABL
'38': APPRX
'39': TERM
- name: Comparison
sequence:
class_label:
names:
'0': CMPR
'1': SPRL
'2': AB
'3': RL
'4': EQT
- name: Definiteness
sequence:
class_label:
names:
'0': DEF
'1': INDF
'2': SPEC
'3': NSPEC
- name: Deixis
sequence:
class_label:
names:
'0': PROX
'1': MED
'2': REMT
'3': REF1
'4': REF2
'5': NOREF
'6': PHOR
'7': VIS
'8': NVIS
'9': ABV
'10': EVEN
'11': BEL
- name: Evidentiality
sequence:
class_label:
names:
'0': FH
'1': DRCT
'2': SEN
'3': VISU
'4': NVSEN
'5': AUD
'6': NFH
'7': QUOT
'8': RPRT
'9': HRSY
'10': INFER
'11': ASSUM
- name: Finiteness
sequence:
class_label:
names:
'0': FIN
'1': NFIN
- name: Gender
sequence:
class_label:
names:
'0': MASC
'1': FEM
'2': NEUT
'3': NAKH1
'4': NAKH2
'5': NAKH3
'6': NAKH4
'7': NAKH5
'8': NAKH6
'9': NAKH7
'10': NAKH8
'11': BANTU1
'12': BANTU2
'13': BANTU3
'14': BANTU4
'15': BANTU5
'16': BANTU6
'17': BANTU7
'18': BANTU8
'19': BANTU9
'20': BANTU10
'21': BANTU11
'22': BANTU12
'23': BANTU13
'24': BANTU14
'25': BANTU15
'26': BANTU16
'27': BANTU17
'28': BANTU18
'29': BANTU19
'30': BANTU20
'31': BANTU21
'32': BANTU22
'33': BANTU23
- name: Information_Structure
sequence:
class_label:
names:
'0': TOP
'1': FOC
- name: Interrogativity
sequence:
class_label:
names:
'0': DECL
'1': INT
- name: Language_Specific
sequence:
class_label:
names:
'0': LGSPEC1
'1': LGSPEC2
'2': LGSPEC3
'3': LGSPEC4
'4': LGSPEC5
'5': LGSPEC6
'6': LGSPEC7
'7': LGSPEC8
'8': LGSPEC9
'9': LGSPEC10
- name: Mood
sequence:
class_label:
names:
'0': IND
'1': SBJV
'2': REAL
'3': IRR
'4': AUPRP
'5': AUNPRP
'6': IMP
'7': COND
'8': PURP
'9': INTEN
'10': POT
'11': LKLY
'12': ADM
'13': OBLIG
'14': DEB
'15': PERM
'16': DED
'17': SIM
'18': OPT
- name: Number
sequence:
class_label:
names:
'0': SG
'1': PL
'2': GRPL
'3': DU
'4': TRI
'5': PAUC
'6': GRPAUC
'7': INVN
- name: Part_Of_Speech
sequence:
class_label:
names:
'0': 'N'
'1': PROPN
'2': ADJ
'3': PRO
'4': CLF
'5': ART
'6': DET
'7': V
'8': ADV
'9': AUX
'10': V.PTCP
'11': V.MSDR
'12': V.CVB
'13': ADP
'14': COMP
'15': CONJ
'16': NUM
'17': PART
'18': INTJ
- name: Person
sequence:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': INCL
'6': EXCL
'7': PRX
'8': OBV
- name: Polarity
sequence:
class_label:
names:
'0': POS
'1': NEG
- name: Politeness
sequence:
class_label:
names:
'0': INFM
'1': FORM
'2': ELEV
'3': HUMB
'4': POL
'5': AVOID
'6': LOW
'7': HIGH
'8': STELEV
'9': STSUPR
'10': LIT
'11': FOREG
'12': COL
- name: Possession
sequence:
class_label:
names:
'0': ALN
'1': NALN
'2': PSS1S
'3': PSS2S
'4': PSS2SF
'5': PSS2SM
'6': PSS2SINFM
'7': PSS2SFORM
'8': PSS3S
'9': PSS3SF
'10': PSS3SM
'11': PSS1D
'12': PSS1DI
'13': PSS1DE
'14': PSS2D
'15': PSS2DM
'16': PSS2DF
'17': PSS3D
'18': PSS3DF
'19': PSS3DM
'20': PSS1P
'21': PSS1PI
'22': PSS1PE
'23': PSS2P
'24': PSS2PF
'25': PSS2PM
'26': PSS3PF
'27': PSS3PM
- name: Switch_Reference
sequence:
class_label:
names:
'0': SS
'1': SSADV
'2': DS
'3': DSADV
'4': OR
'5': SIMMA
'6': SEQMA
'7': LOG
- name: Tense
sequence:
class_label:
names:
'0': PRS
'1': PST
'2': FUT
'3': IMMED
'4': HOD
'5': 1DAY
'6': RCT
'7': RMT
- name: Valency
sequence:
class_label:
names:
'0': IMPRS
'1': INTR
'2': TR
'3': DITR
'4': REFL
'5': RECP
'6': CAUS
'7': APPL
- name: Voice
sequence:
class_label:
names:
'0': ACT
'1': MID
'2': PASS
'3': ANTIP
'4': DIR
'5': INV
'6': AGFOC
'7': PFOC
'8': LFOC
'9': BFOC
'10': ACFOC
'11': IFOC
'12': CFOC
- name: Other
sequence: string
splits:
- name: train
num_bytes: 5258481
num_examples: 1176
download_size: 302856
dataset_size: 5258481
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- he
--- |
antolin/tlc_interduplication | ---
dataset_info:
features:
- name: id_within_dataset
dtype: int64
- name: snippet
dtype: string
- name: tokens
sequence: string
- name: nl
dtype: string
- name: split_within_dataset
dtype: string
- name: is_duplicated
dtype: bool
splits:
- name: train
num_bytes: 70652063.18677872
num_examples: 53327
- name: test
num_bytes: 8799876.304434607
num_examples: 6642
- name: valid
num_bytes: 8831673.508786675
num_examples: 6666
download_size: 33772946
dataset_size: 88283613.00000001
---
# Dataset Card for "tlc_interduplication"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sazirarrwth99/last_stage_dataset_kangoroo_test | ---
dataset_info:
features:
- name: text
dtype: string
- name: possible_classes
dtype: string
- name: possible_properties
dtype: string
- name: subgraph
dtype: string
- name: triplets
dtype: string
- name: origin
dtype: string
splits:
- name: train
num_bytes: 4007308
num_examples: 1157
download_size: 829283
dataset_size: 4007308
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zxbsmk/webnovel_cn | ---
license: mit
task_categories:
- text2text-generation
language:
- zh
size_categories:
- 10M<n<100M
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
---
## Contents
Contains about **21.7M** Chinese instruction examples for training novel generation, extracted from 12,560 web novels (novel_json_tokens512.zip). ~~Download link: https://pan.baidu.com/s/1TorBMbrqxrn6odRF0PJBVw
Extraction code: jlh3~~
Also includes a **50k**-example subset (novel_cn_token512_50k.json) extracted from it, in which both input and output are no longer than 512 tokens.
## Examples
Data are generated from the original novel text according to the five instruction types below.
The text consists of consecutive sentences sampled at random from a novel.
1. Given a title, generate a synopsis directly.
2. Given a title and a synopsis, generate an opening.
3. Given a synopsis and a passage of text, generate the continuation.
4. Given a title and a passage of text, generate the continuation.
5. Given a passage of text, generate the continuation.
```
{
"instruction": "小说名:无限恐怖\n节选正文:\n“不行,中校,我们必须把这里的情况和这些虫族的拍摄画面传回地球去,是的,我们人类已经到了最危险的关头,地球上所有的工业重工业完全应该按照战时情况进行任务布置,我们人类的工业力根本不是虫族能够想象的,一年,不,半年我们就能有一万艘宇宙战舰升空,如果全球一共进行建造的话,我们一定……”一名少校参谋长看着那密密麻麻的宇宙虫海,他神色激动的对楚轩说道。\n楚轩默默咬了一口苹果,他说道:“问题一……一万艘宇宙战舰需要多少熟练的战舰船员?需要多少士兵在战舰上服役,半年里我们可以聚集齐全吗?问题二……虫族会给地球联邦政府留下半年的时间吗?一旦我们被消灭干净,则联邦政府将会随即遭受到这些宇宙虫族的太空袭击,直到我们的太空优势完全丧失之后,虫族就能放心移动陨石进行攻击,甚至是将大量陆地虫子带上地球,那时我们人类能够凭肉体与虫子对抗吗?问题三……我们现在该如何把信息传给地球?”\n这名参谋长顿时就不说话了,而旁边一名上尉忽然说道:“在探戈幽灵星上的七座要塞中,每一座要塞都有大功率粒子喷射联络器,这种联络器连我们自己的科技都无法屏蔽,我想虫子也应该不可能继续屏蔽吧,毕竟那只是粒子而已,不是电流信号这些……”续篇:",
"input": "",
"output": "楚轩咬了一口苹果说道:“没错,所以我们才要登陆到探戈幽灵星,这是我们的目的,同时为了保存宇宙舰队的战斗力,以最大的平原军事基地,风暴要塞为中心,所有舰队都要降落到地面上,然后层层布防,在援军到来前,在地球联邦政府做好了战争准备前,我们必须像颗钉子一样钉死在这里,让虫族无法忽视我们而去进攻地球……怎么样?我们很可能会死在这里,面对数十万,数百万,甚至数千万的虫子前扑后续的攻击,但是我们的名字一定会载入史册……把刚才记录下来的我的话发向所有舰队,然后告诉他们,作为逃命被自己人杀掉,或者是成为英雄回到地球,让他们明白我们的处境,然后自己去选择吧。”\n周围的参谋和军官们同时向楚轩四人敬了一礼,接着那参谋长激动的向某处平台走了过去。\n郑吒默默走到了楚轩身边,他小声的说道:“楚轩……你什么时候变得了那么会煽情了啊?”\n楚轩却是理所当然的说道:“将自己归于多数人的一边,以前的你不是这样评价我的吗?没错,将自己归于多数人的一边,这是做任何大事都要先完成的第一步……已经让他们知道命运和我们连接在了一起,接着就只需要好好的安排下局面与等待‘主神’的任务就可以了,时间还有三天……”\n时间还有三天,在当天中午的时候,舰队群的预警舰果然发现了在探戈幽灵星后方徘徊着另一颗巨大圆球,它仿佛卫星一样座落在探戈幽灵星的近地轨道上,而随着联合舰队的到来,这只巨大圆球上果然也飞出了数以万计的宇宙虫子,这下联合舰队果然却如楚轩的预言那般了,前有埋伏,后有追兵,唯一的一条路就只剩下降落到探戈幽灵星上了。"
},
{
"instruction": "给定小说简介和节选,续写小说",
"input": "小说简介:主人公郑吒自从失去了自己最亲密的青梅竹马后,对这种反复而又无聊的现代生活已经感到十分的厌倦。正在这时,他发现电脑屏幕上弹出了一段信息:“想明白生命的意义吗?想真正的……活着吗?”在按下YES后,一切都改变了。他进入了一个恐怖片的轮回世界——主神空间……在主神空间里,只有不停地变强、不停地进化,才能闯过那一关关的恐怖片,才能活下去。郑吒,怎样才能活下去?是杀死与自己对抗的所有人,走向孤独的王者之道?还是和自己的伙伴奋战到死,以仁义之道来度过劫难?其实,所有的一切也只是为了活下去。直到他们发现了主神空间的秘密……究竟……谁才是暗中真正的威胁?一切尽在无限恐怖!\n\n\n\n上半部:初始\n节选篇章:“什么叫作事情已经发展到了这一步!”郑吒大声的喧哗吼叫着,他冲着楚轩吼道:“你这个白痴!我一直都在重复着同样一句话,我们是同伴啊,为什么不可以事先把你的猜测或者布局告诉我们呢?每次都把我当成白痴一样,每次都在算计我之后轻描淡写的说一句我骗你的,你究竟把我当成什么人?”\n“呃……大概是可以欺骗的凡人智慧者吧。”楚轩想了一下直接说道。\n“……不要拦着我!王侠,伊莫顿,你们放开我!我要干掉这个白痴!”\n喧哗又一次发生了,片刻之后再次安静下来时,楚轩本已略微肿胀的脸终于是彻底肿胀了起来,而看他的样子依然是一丁点疼痛都感觉不到,依然悠闲的吃着黄瓜。\n“……总之,事情已经发展到了这一步……”郑吒摸着额头,很痛苦的说道:“那么我们必须先要得到王女鄂亚的关押位置所在,当然了,直接打进首都也其实也不是不可以,但是从原剧情来看,似乎只有那个名叫邪影者的魔法师与国王歌巴塔尔知道她的关押位置,为了不打草惊蛇,还是花些时间慢慢渗透进去查找线索吧。”\n“赵樱空,你负责潜行进王城内,能不杀人尽量不杀人,若是遇到危险就转换人格,或者联络我们……伊莫顿,你负责到王城外的村庄去打探消息,尽量不要引人注意,若是可以,使用你的法术控制些人也行,照刚才我说的,遇到危险可以杀人,也可以联络我们,楚轩,王侠,你们两人就在附近待机,一旦发现有任何情况都要提供支援。”",
"output": "众人都点着头,王侠忽然好奇的问道:“你呢?你打算干什么去?”\n“我会把整个王城的情况给摸清楚,关键是王城里可能隐藏着的力量,我很在意啊,伊莫顿的感觉从某些方面来讲比我们都更出色,毕竟他是死亡的生物……或许,这场恐怖片世界并不像我们所想的那样轻松呢。”郑吒说到这里时叹了口气,他接着又看向了那首都位置。\n“时间还有两天多,我们尽量在三天内完成这部恐怖片世界吧……希望另一边的幼龙能够赶快成长。”郑吒边说话,边驾驶绿魔滑板就向地面飞去,渐渐的,他离众人已经越来越远了。\n此刻,在离王城首都极遥远外的小村处,主角伊拉贡正极其狼狈的奔跑在树丛中,跟随在他身边的还有他的舅舅……非常不幸的,逃跑没多久,他的表哥就失散在了这片森林中,或者说是被那些士兵们给抓住了也说不定。\n更加不幸的是,那名中年武士明显已经落败,不然不会多出那么多士兵紧紧追着他们,比起在村庄的时候,士兵的数量又更加的多了,至少有十多名士兵在他们不远处紧紧追赶。\n“你到底偷了什么东西啊!为什么会有这么多士兵来追赶你呢?”伊拉贡的舅舅气喘吁吁的问道,他已经跑得没什么精力去发怒了。\n“……一个龙蛋,不是偷的,这是我从森林里拣来的!”伊拉贡虽然也是跑得筋疲力尽,但他还在坚持着最后的底线,依然不停辩解着。\n“龙蛋?那可是国王的东西啊!而且还是孵化出幼龙的龙蛋!你这个白痴,你这样会害死大家的!”伊拉贡的舅舅一听此话就气急败坏的叫道,但他还是不停向前跑去,不敢有丁点停顿,因为在他们背后不停的追赶着十多名士兵。\n“在那里!看到他们了!他们在那里!”"
}
```
## Fields
```
instruction: the instruction
input: the input
output: the output
```
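As a quick illustration of how these three fields are typically assembled into a supervised training pair (a minimal sketch; `build_prompt` is a hypothetical helper, not part of the dataset):

```python
def build_prompt(record: dict) -> tuple[str, str]:
    """Combine instruction and input into a single prompt string.

    `record` is assumed to follow the instruction/input/output schema
    described above; `input` is often empty in this dataset.
    """
    prompt = record["instruction"]
    if record["input"]:
        prompt += "\n" + record["input"]
    return prompt, record["output"]


sample = {
    "instruction": "给定小说节选,续写小说",
    "input": "",
    "output": "……",
}
prompt, target = build_prompt(sample)  # prompt == sample["instruction"]
```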
## Usage Restrictions
This dataset, and any derivatives generated with it, may be used for research purposes only; commercial use and any other use that could harm society are prohibited.
This dataset does not represent the position, interests, or views of any party, and is unrelated to claims of any kind by any group. This project assumes no liability for any damage or dispute arising from the use of this dataset.
Join group via https://t.me/+JbovpBG6-gBiNDI1 |
open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b | ---
pretty_name: Evaluation run of martyn/llama-megamerge-dare-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [martyn/llama-megamerge-dare-13b](https://huggingface.co/martyn/llama-megamerge-dare-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T12:51:07.798960](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b/blob/main/results_2023-12-13T12-51-07.798960.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5501426946833134,\n\
\ \"acc_stderr\": 0.033816613061950815,\n \"acc_norm\": 0.5548026687947745,\n\
\ \"acc_norm_stderr\": 0.0345287472758958,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4575555939890639,\n\
\ \"mc2_stderr\": 0.015024821972393557\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464396,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6385182234614618,\n\
\ \"acc_stderr\": 0.004794478426382609,\n \"acc_norm\": 0.8300139414459271,\n\
\ \"acc_norm_stderr\": 0.0037485288878381204\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342654,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342654\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n\
\ \"acc_stderr\": 0.027621717832907032,\n \"acc_norm\": 0.6193548387096774,\n\
\ \"acc_norm_stderr\": 0.027621717832907032\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094528,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094528\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.03186608121408832,\n \
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.03186608121408832\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035296,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465918,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465918\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7305236270753512,\n\
\ \"acc_stderr\": 0.015866243073215075,\n \"acc_norm\": 0.7305236270753512,\n\
\ \"acc_norm_stderr\": 0.015866243073215075\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654082,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n\
\ \"acc_stderr\": 0.01585200244986209,\n \"acc_norm\": 0.3407821229050279,\n\
\ \"acc_norm_stderr\": 0.01585200244986209\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.01259674410899856,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.01259674410899856\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4575555939890639,\n\
\ \"mc2_stderr\": 0.015024821972393557\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702304\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2850644427596664,\n \
\ \"acc_stderr\": 0.012435042334904004\n }\n}\n```"
repo_url: https://huggingface.co/martyn/llama-megamerge-dare-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|arc:challenge|25_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|gsm8k|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hellaswag|10_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T12-51-07.798960.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- '**/details_harness|winogrande|5_2023-12-13T12-51-07.798960.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T12-51-07.798960.parquet'
- config_name: results
data_files:
- split: 2023_12_13T12_51_07.798960
path:
- results_2023-12-13T12-51-07.798960.parquet
- split: latest
path:
- results_2023-12-13T12-51-07.798960.parquet
---
# Dataset Card for Evaluation run of martyn/llama-megamerge-dare-13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [martyn/llama-megamerge-dare-13b](https://huggingface.co/martyn/llama-megamerge-dare-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b",
"harness_winogrande_5",
split="train")
```
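The config name passed to `load_dataset` is derived mechanically from the harness task id: the `|`, `:`, and `-` separators become underscores (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). A small hypothetical helper sketching that mapping:

```python
def to_config_name(task: str) -> str:
    """Hypothetical helper: map a harness task id such as
    'harness|hendrycksTest-abstract_algebra|5' to the config name
    'harness_hendrycksTest_abstract_algebra_5' expected by load_dataset."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(to_config_name("harness|winogrande|5"))  # harness_winogrande_5
```

This helper is an illustration of the naming convention only; the canonical list of config names is the one in the YAML header above.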
## Latest results
These are the [latest results from run 2023-12-13T12:51:07.798960](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b/blob/main/results_2023-12-13T12-51-07.798960.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5501426946833134,
"acc_stderr": 0.033816613061950815,
"acc_norm": 0.5548026687947745,
"acc_norm_stderr": 0.0345287472758958,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4575555939890639,
"mc2_stderr": 0.015024821972393557
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464396,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.6385182234614618,
"acc_stderr": 0.004794478426382609,
"acc_norm": 0.8300139414459271,
"acc_norm_stderr": 0.0037485288878381204
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342654,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342654
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.027621717832907032,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.027621717832907032
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094528,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094528
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.03186608121408832,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.03186608121408832
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035296,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465918,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465918
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7305236270753512,
"acc_stderr": 0.015866243073215075,
"acc_norm": 0.7305236270753512,
"acc_norm_stderr": 0.015866243073215075
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654082,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.01585200244986209,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.01585200244986209
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.01259674410899856,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.01259674410899856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.02017061497496976,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.02017061497496976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4575555939890639,
"mc2_stderr": 0.015024821972393557
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702304
},
"harness|gsm8k|5": {
"acc": 0.2850644427596664,
"acc_stderr": 0.012435042334904004
}
}
```
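For a quick sanity check, the per-task accuracies above can be averaged with a few lines of Python. This is a sketch over an illustrative subset of three tasks; the `"all"` entry in the results aggregates every evaluated task:

```python
# Average a few of the per-task accuracies shown above (illustrative subset;
# the "all" entry in the results covers every task in the run).
results = {
    "professional_accounting": 0.43617021276595747,
    "professional_law": 0.4178617992177314,
    "professional_medicine": 0.47058823529411764,
}
mean_acc = sum(results.values()) / len(results)
print(round(mean_acc, 4))
```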
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Bjanota11/arxiv-rag-papers | ---
dataset_info:
features:
- name: Published
dtype: string
- name: Title
dtype: string
- name: Authors
dtype: string
- name: Summary
dtype: string
- name: entry_id
dtype: string
- name: published_first_time
dtype: string
- name: categories
sequence: string
- name: full_text
dtype: string
- name: full_text_len
dtype: int64
- name: full_text_num_words
dtype: int64
splits:
- name: train
num_bytes: 5942270
num_examples: 100
download_size: 3284707
dataset_size: 5942270
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_bare_perfect | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 961051
num_examples: 5271
- name: test
num_bytes: 9399530
num_examples: 51736
- name: train
num_bytes: 8688168
num_examples: 47304
download_size: 12091894
dataset_size: 19048749
---
# Dataset Card for "MULTI_VALUE_qqp_bare_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kye/all-lucidrain-code-python-tokenized-8192-2 | ---
dataset_info:
features:
- name: python_code
sequence: string
- name: repo_name
sequence: string
- name: file_path
sequence: string
splits:
- name: train
num_bytes: 875787
num_examples: 16
download_size: 2857
dataset_size: 875787
---
# Dataset Card for "all-lucidrain-code-python-tokenized-8192-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naphatmanu/index-contemporary-1 | ---
license: mit
---
|
dmayhem93/agieval-sat-en | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 1019350
num_examples: 206
download_size: 265465
dataset_size: 1019350
license: mit
---
# Dataset Card for "agieval-sat-en"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo.
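Each record pairs a `query` with its `choices` and a `gold` answer index. A minimal sketch of reading one record follows; the record below is hypothetical, written to match the `query`/`choices`/`gold` schema in the header, not taken from the dataset:

```python
# Hypothetical record following the query/choices/gold schema described above.
example = {
    "query": "The author's tone in the passage is best described as:",
    "choices": ["(A) critical", "(B) nostalgic", "(C) objective", "(D) celebratory"],
    "gold": [2],  # index into choices
}
answer = example["choices"][example["gold"][0]]
print(answer)  # (C) objective
```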
MIT License
Copyright (c) Microsoft Corporation.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
@misc{zhong2023agieval,
      title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
      author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
      year={2023},
      eprint={2304.06364},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
``` |
Maeda-miyazaki/test_dataset | ---
license: cc-by-nc-sa-3.0
---
|
Weni/LLM-base-clean | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 44378631
num_examples: 20042
download_size: 13976411
dataset_size: 44378631
---
# Dataset Card for "LLM-base-clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
satendra4u2022/Updated_DPO_dataset | ---
license: mit
---
|
tarekeldeeb/ArabicCorpus2B | ---
license: other
---
```
BUILDING VOCABULARY
Processed 1754541204 tokens.
Counted 5329509 unique words.
Truncating vocabulary at min count 5.
Using vocabulary of size 1539115.
```
---
# Build the Arabic Corpus
#### Download Resources
The Arabic corpus {1.9B words} consists of the following resources:
- ShamelaLibrary348.7z [link](https://www.quran.tv/ketab/ShamelaLibrary348.7z) {1.15B}
- UN arabic corpus [mirror1](http://lotus.kuee.kyoto-u.ac.jp/~raj/rajwindroot/corpora_downloads/UN_CORPUS/UNv1.0.6way.ar.txt) [mirror2](http://corpus.leeds.ac.uk/bogdan/resources/UN-corpus/6way/UNv1.0.6way.ar.txt) {0.37B}
- AraCorpus.tar.gz [link](http://aracorpus.e3rab.com/argistestsrv.nmsu.edu/AraCorpus.tar.gz) {0.14B}
- Arabic Wikipedia Latest Articles Dump [link](https://dumps.wikimedia.org/arwiki/latest/arwiki-latest-pages-articles.xml.bz2) {0.11B}
- Tashkeela-arabic-diacritized-text-utf8-0.3.zip [link](https://netix.dl.sourceforge.net/project/tashkeela/) {0.07B}
- Arabic Tweets [link](https://github.com/bakrianoo/Datasets) {0.03B}
- watan-2004.7z [link](https://netix.dl.sourceforge.net/project/arabiccorpus/watan-2004corpus/watan-2004.7z) {0.01B}
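The vocabulary statistics at the top (truncation at min count 5) come from the standard GloVe vocabulary-count step. A minimal sketch of that filtering, using placeholder tokens rather than real corpus text:

```python
from collections import Counter

def build_vocab(tokens, min_count=5):
    """Count tokens and keep only those seen at least min_count times."""
    counts = Counter(tokens)
    return {word: n for word, n in counts.items() if n >= min_count}

# Placeholder token stream (not from the corpus): "كتاب" x6, "بيت" x5, "قلم" x3.
tokens = ["كتاب"] * 6 + ["بيت"] * 5 + ["قلم"] * 3
vocab = build_vocab(tokens)
print(vocab)  # "قلم" is dropped: it appears fewer than 5 times
```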
#### Build Script: https://github.com/tarekeldeeb/GloVe-Arabic/tree/master/arabic_corpus
---
# Download the dataset
Mirror : https://archive.org/details/arabic_corpus
---
license: Waqf v2 (https://github.com/ojuba-org/waqf/tree/master/2.0) |
huggingartists/mnogoznaal | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/mnogoznaal"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.210071 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/29ca6a878f02979daf772290e6e71f48.1000x1000x1.png')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/mnogoznaal">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Mnogoznaal</div>
<a href="https://genius.com/artists/mnogoznaal">
<div style="text-align: center; font-size: 14px;">@mnogoznaal</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/mnogoznaal).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mnogoznaal")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|88| -| -|
'Train' can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/mnogoznaal")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
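With the 88-example train split above, those proportions yield 79/6/3 examples. A dependency-free sketch of the same cut points, using placeholder strings instead of lyrics:

```python
# Dependency-free check of the cut points used by np.split above
# (placeholder strings stand in for the lyrics).
texts = [f"song_{i}" for i in range(88)]  # 88 items, matching the train split size
train_pct, validation_pct = 0.9, 0.07
i = int(len(texts) * train_pct)                     # 79
j = int(len(texts) * (train_pct + validation_pct))  # 85
train, validation, test = texts[:i], texts[i:j], texts[j:]
print(len(train), len(validation), len(test))  # 79 6 3
```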
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
CyberHarem/serie_sousounofrieren | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Serie/ゼーリエ (Sousou no Frieren)
This is the dataset of Serie/ゼーリエ (Sousou no Frieren), containing 116 images and their tags.
The core tags of this character are `long_hair, pointy_ears, blonde_hair, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 116 | 86.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serie_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 116 | 86.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serie_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 220 | 154.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serie_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/serie_sousounofrieren',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, closed_mouth, earrings, elf, solo, portrait, yellow_eyes, blurry_background, anime_coloring, day, looking_at_viewer, outdoors, smile, ear_piercing |
| 1 | 6 |  |  |  |  |  | 1girl, blurry_background, closed_mouth, profile, solo, upper_body, from_side, bookshelf, smile |
| 2 | 8 |  |  |  |  |  | 1girl, elf, from_side, solo, bare_shoulders, necklace, off_shoulder, profile, white_dress, closed_mouth, upper_body, smile |
| 3 | 6 |  |  |  |  |  | 1girl, backlighting, bare_shoulders, elf, looking_at_viewer, smile, solo, upper_body, yellow_eyes, necklace, closed_mouth |
| 4 | 19 |  |  |  |  |  | 1girl, closed_mouth, elf, solo, jewelry, looking_at_viewer, cleavage, smile, robe, upper_body, hair_tubes |
| 5 | 5 |  |  |  |  |  | 1girl, elf, solo, standing, white_dress, full_body, arms_at_sides, bare_shoulders, necklace, floating_hair, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | earrings | elf | solo | portrait | yellow_eyes | blurry_background | anime_coloring | day | looking_at_viewer | outdoors | smile | ear_piercing | profile | upper_body | from_side | bookshelf | bare_shoulders | necklace | off_shoulder | white_dress | backlighting | jewelry | cleavage | robe | hair_tubes | standing | full_body | arms_at_sides | floating_hair | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:------|:-------|:-----------|:--------------|:--------------------|:-----------------|:------|:--------------------|:-----------|:--------|:---------------|:----------|:-------------|:------------|:------------|:-----------------|:-----------|:---------------|:--------------|:---------------|:----------|:-----------|:-------|:-------------|:-----------|:------------|:----------------|:----------------|:-----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | X | | | X | | | | | X | | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | X | X | | | | | | | | X | | X | X | X | | X | X | X | X | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | X | X | | X | | | | X | | X | | | X | | | X | X | | | X | | | | | | | | | |
| 4 | 19 |  |  |  |  |  | X | X | | X | X | | | | | | X | | X | | | X | | | | | | | | X | X | X | X | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | X | | | | | | | | | | | | | | X | X | | X | | | | | | X | X | X | X | X |
|
richardr1126/spider-natsql-context-instruct | ---
language:
- en
license:
- cc-by-4.0
source_datasets:
- spider
tags:
- sql
- spider
- natsql
- text-to-sql
- sql finetune
dataset_info:
features:
- name: db_id
dtype: string
- name: text
dtype: string
---
# Dataset Card for Spider NatSQL Context Instruct
### Dataset Summary
[Spider](https://arxiv.org/abs/1809.08887) is a large-scale, complex, cross-domain semantic parsing and text-to-SQL dataset annotated by 11 Yale students.
The goal of the Spider challenge is to develop natural language interfaces to cross-domain databases.
This dataset was created to finetune LLMs on the Spider dataset with database context using NatSQL.
### NatSQL
[NatSQL](https://arxiv.org/abs/2109.05153) is an intermediate representation for SQL that simplifies the queries and reduces the mismatch between
natural language and SQL. NatSQL preserves the core functionalities of SQL, but removes some clauses and keywords
that are hard to infer from natural language descriptions. NatSQL also makes schema linking easier by reducing the
number of schema items to predict. NatSQL can be easily converted to executable SQL queries and can improve the
performance of text-to-SQL models.
### Yale Lily Spider Leaderboards
The leaderboard can be seen at https://yale-lily.github.io/spider
### Languages
The text in the dataset is in English.
### Licensing Information
The spider dataset is licensed under
the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode)
### Citation
```
@article{yu2018spider,
title={Spider: A large-scale human-labeled dataset for complex and cross-domain semantic parsing and text-to-sql task},
author={Yu, Tao and Zhang, Rui and Yang, Kai and Yasunaga, Michihiro and Wang, Dongxu and Li, Zifan and Ma, James and Li, Irene and Yao, Qingning and Roman, Shanelle and others},
journal={arXiv preprint arXiv:1809.08887},
year={2018}
}
```
```
@inproceedings{gan-etal-2021-natural-sql,
title = "Natural {SQL}: Making {SQL} Easier to Infer from Natural Language Specifications",
author = "Gan, Yujian and
Chen, Xinyun and
Xie, Jinxia and
Purver, Matthew and
Woodward, John R. and
Drake, John and
Zhang, Qiaofu",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
month = nov,
year = "2021",
address = "Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-emnlp.174",
doi = "10.18653/v1/2021.findings-emnlp.174",
pages = "2030--2042",
}
``` |
TheBritishLibrary/EThOS-PhD-metadata | ---
annotations_creators: []
language:
- en
language_creators: []
license: []
multilinguality:
- monolingual
pretty_name: EThOS PhD metadata
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-classification
- fill-mask
task_ids:
- multi-label-classification
- masked-language-modeling
---
# Dataset Card for EThOS PhD metadata
## Table of Contents
- [Dataset Card for EThOS PhD metadata](#dataset-card-for-ethos-phd-metadata)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Supervised tasks](#supervised-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**: https://bl.iro.bl.uk/concern/datasets/c815b271-09be-4123-8156-405094429198?locale=en
- **Repository:** https://doi.org/10.23636/ybpt-nh33
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The data in this collection comprises the bibliographic metadata for all UK doctoral theses listed in EThOS, the UK's national thesis service. We estimate the data covers around 98% of all PhDs ever awarded by UK Higher Education institutions, dating back to 1787. Thesis metadata from every PhD-awarding university in the UK is included. You can investigate and re-use this unique collection of UK universities' PhD thesis data to analyse trends in postgraduate research, make connections between researchers, apply large data analysis, improve citation of theses and many more applications.
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
#### Supervised tasks
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
[More Information Needed]
### Data Instances
An example data instance:
```python
{'Abstract': ' ',
'Author': 'Loizou, Panos A.',
'Author ISNI': 'https://isni.org/isni/0000000136122593',
'DOI': ' ',
'Date': datetime.datetime(1989, 1, 1, 0, 0),
'EThOS URL': 'https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232781',
'Funder(s)': ' ',
'IR URL': ' ',
'Institution': 'University of Manchester',
'Institution ISNI': 'https://isni.org/isni/0000000121662407',
'ORCID': ' ',
'Qualification': 'Thesis (Ph.D.)',
'Subject Discipline': 0,
'Supervisor(s)': ' ',
'Title': 'Computation and measurement of turbulent flow through idealized turbine blade passages'}
```
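Records with this shape can be filtered with plain Python once loaded. A sketch with a hypothetical two-record list standing in for the full dataset (only two of the fields are mirrored here):

```python
# Hypothetical records mirroring two fields of the instance schema above.
records = [
    {"Institution": "University of Manchester", "Date": 1989},
    {"Institution": "University of Edinburgh", "Date": 2005},
]
pre_2000 = [r for r in records if r["Date"] < 2000]
print(len(pre_2000), pre_2000[0]["Institution"])  # 1 University of Manchester
```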
### Data Fields
[More Information Needed]
### Data Splits
This dataset contains a single split `train`.
## Dataset Creation
[More Information Needed]
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
[More Information Needed]
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The data is licensed under the [CC BY 4.0 Attribution](https://creativecommons.org/licenses/by/4.0/) license.
### Citation Information
|
open-llm-leaderboard/details_yyjjtt__test-model | ---
pretty_name: Evaluation run of yyjjtt/test-model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yyjjtt/test-model](https://huggingface.co/yyjjtt/test-model) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yyjjtt__test-model\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T17:29:33.707881](https://huggingface.co/datasets/open-llm-leaderboard/details_yyjjtt__test-model/blob/main/results_2023-12-09T17-29-33.707881.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2582521615185269,\n\
\ \"acc_stderr\": 0.030847868754913528,\n \"acc_norm\": 0.2593091182470286,\n\
\ \"acc_norm_stderr\": 0.03166965639566685,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.44593057174416123,\n\
\ \"mc2_stderr\": 0.015586502428911173\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702247,\n\
\ \"acc_norm\": 0.2440273037542662,\n \"acc_norm_stderr\": 0.012551447627856262\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2876916948814977,\n\
\ \"acc_stderr\": 0.004517614647703248,\n \"acc_norm\": 0.30173272256522604,\n\
\ \"acc_norm_stderr\": 0.0045807181159925135\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882921,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882921\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708083,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708083\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n\
\ \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n\
\ \"acc_stderr\": 0.02960562398177122,\n \"acc_norm\": 0.18497109826589594,\n\
\ \"acc_norm_stderr\": 0.02960562398177122\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n\
\ \"acc_stderr\": 0.02528441611490016,\n \"acc_norm\": 0.2709677419354839,\n\
\ \"acc_norm_stderr\": 0.02528441611490016\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198913,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198913\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.02136202772522273,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.02136202772522273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958945,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958945\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3192660550458716,\n \"acc_stderr\": 0.01998782906975001,\n \"\
acc_norm\": 0.3192660550458716,\n \"acc_norm_stderr\": 0.01998782906975001\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n\
\ \"acc_stderr\": 0.030190282453501967,\n \"acc_norm\": 0.24509803921568626,\n\
\ \"acc_norm_stderr\": 0.030190282453501967\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.03826076324884864,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.03826076324884864\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20085470085470086,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.20085470085470086,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046105,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046105\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24437299035369775,\n\
\ \"acc_stderr\": 0.024406162094668907,\n \"acc_norm\": 0.24437299035369775,\n\
\ \"acc_norm_stderr\": 0.024406162094668907\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n\
\ \"acc_stderr\": 0.010792595553888496,\n \"acc_norm\": 0.23272490221642764,\n\
\ \"acc_norm_stderr\": 0.010792595553888496\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714857,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714857\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21568627450980393,\n \"acc_stderr\": 0.01663931935031326,\n \
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.01663931935031326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984925,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984925\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.02540930195322568,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.02540930195322568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.44593057174416123,\n\
\ \"mc2_stderr\": 0.015586502428911173\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5082872928176796,\n \"acc_stderr\": 0.014050555322824189\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/yyjjtt/test-model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|arc:challenge|25_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|gsm8k|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hellaswag|10_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T17-29-33.707881.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- '**/details_harness|winogrande|5_2023-12-09T17-29-33.707881.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T17-29-33.707881.parquet'
- config_name: results
data_files:
- split: 2023_12_09T17_29_33.707881
path:
- results_2023-12-09T17-29-33.707881.parquet
- split: latest
path:
- results_2023-12-09T17-29-33.707881.parquet
---
# Dataset Card for Evaluation run of yyjjtt/test-model
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yyjjtt/test-model
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yyjjtt/test-model](https://huggingface.co/yyjjtt/test-model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yyjjtt__test-model",
"harness_winogrande_5",
split="train")
```
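The split names visible in the configurations above are derived from the run timestamp by replacing `-` and `:` with `_`. A minimal string-only sketch (no network access or `datasets` install required) of that mapping:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp (as it appears in a results filename,
    e.g. '2023-12-09T17:29:33.707881') into the split name used in
    this dataset's configurations (e.g. '2023_12_09T17_29_33.707881')."""
    return timestamp.replace("-", "_").replace(":", "_")


# The run shown in this card maps to the split name used above:
print(timestamp_to_split("2023-12-09T17:29:33.707881"))
# 2023_12_09T17_29_33.707881
```

You can pass the resulting string as `split=` to `load_dataset` to pin a specific run instead of `"latest"`.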
## Latest results
These are the [latest results from run 2023-12-09T17:29:33.707881](https://huggingface.co/datasets/open-llm-leaderboard/details_yyjjtt__test-model/blob/main/results_2023-12-09T17-29-33.707881.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2582521615185269,
"acc_stderr": 0.030847868754913528,
"acc_norm": 0.2593091182470286,
"acc_norm_stderr": 0.03166965639566685,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.44593057174416123,
"mc2_stderr": 0.015586502428911173
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702247,
"acc_norm": 0.2440273037542662,
"acc_norm_stderr": 0.012551447627856262
},
"harness|hellaswag|10": {
"acc": 0.2876916948814977,
"acc_stderr": 0.004517614647703248,
"acc_norm": 0.30173272256522604,
"acc_norm_stderr": 0.0045807181159925135
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.039725528847851375,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.039725528847851375
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882921,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882921
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708083,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708083
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177122,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177122
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198913,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198913
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.02136202772522273,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.02136202772522273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958945,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3192660550458716,
"acc_stderr": 0.01998782906975001,
"acc_norm": 0.3192660550458716,
"acc_norm_stderr": 0.01998782906975001
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501967,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501967
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.03826076324884864,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.03826076324884864
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20085470085470086,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.20085470085470086,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24437299035369775,
"acc_stderr": 0.024406162094668907,
"acc_norm": 0.24437299035369775,
"acc_norm_stderr": 0.024406162094668907
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23272490221642764,
"acc_stderr": 0.010792595553888496,
"acc_norm": 0.23272490221642764,
"acc_norm_stderr": 0.010792595553888496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714857,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714857
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.01663931935031326,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.01663931935031326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984925,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984925
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.44593057174416123,
"mc2_stderr": 0.015586502428911173
},
"harness|winogrande|5": {
"acc": 0.5082872928176796,
"acc_stderr": 0.014050555322824189
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rwitz/filtered_pajama3 | ---
dataset_info:
features:
- name: raw_content
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 0
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "filtered_pajama3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nairaxo/shimaore-lexicon | ---
dataset_info:
features:
- name: ID
dtype: int64
- name: Word
dtype: string
- name: Word Form
dtype: string
- name: Translation (fr)
dtype: string
- name: Translation (en) (Google)
dtype: string
- name: POS
dtype: string
- name: Polarity
dtype: float64
- name: Sentiment
dtype: string
splits:
- name: train
num_bytes: 192121
num_examples: 2161
download_size: 73500
dataset_size: 192121
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shimaore-lexicon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tfshaman/metamath_gsm8k_sympy_v1 | ---
dataset_info:
features:
- name: output
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: code_output
dtype: float64
- name: data_type
dtype: string
splits:
- name: train
num_bytes: 460198126
num_examples: 191710
download_size: 160470899
dataset_size: 460198126
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "metamath_gsm8k_sympy_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JohnnyYu22/CPSC2018_20s | ---
license: other
license_name: other
license_link: LICENSE
---
|
distilled-from-one-sec-cv12/chunk_127 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1191922396
num_examples: 232253
download_size: 1218285876
dataset_size: 1191922396
---
# Dataset Card for "chunk_127"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong-2b | ---
pretty_name: Evaluation run of RESMPDEV/Gemma-Wukong-2b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RESMPDEV/Gemma-Wukong-2b](https://huggingface.co/RESMPDEV/Gemma-Wukong-2b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong-2b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T03:03:10.199714](https://huggingface.co/datasets/open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong-2b/blob/main/results_2024-03-02T03-03-10.199714.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3831572150396804,\n\
\ \"acc_stderr\": 0.034077698841351076,\n \"acc_norm\": 0.38708705016212447,\n\
\ \"acc_norm_stderr\": 0.03485851209755768,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4429426283623105,\n\
\ \"mc2_stderr\": 0.014956758030618461\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4249146757679181,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.45307167235494883,\n \"acc_norm_stderr\": 0.01454689205200563\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5052778331009758,\n\
\ \"acc_stderr\": 0.004989503417767287,\n \"acc_norm\": 0.6693885680143398,\n\
\ \"acc_norm_stderr\": 0.00469471891822576\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3618421052631579,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.3618421052631579,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\
\ \"acc_stderr\": 0.03550683989165582,\n \"acc_norm\": 0.3179190751445087,\n\
\ \"acc_norm_stderr\": 0.03550683989165582\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.02293097307163335,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.02293097307163335\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03718489006818114,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03718489006818114\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.42258064516129035,\n \"acc_stderr\": 0.02810096472427264,\n \"\
acc_norm\": 0.42258064516129035,\n \"acc_norm_stderr\": 0.02810096472427264\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"\
acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.03878372113711275,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.03878372113711275\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4090909090909091,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.4090909090909091,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.035944137112724366,\n\
\ \"acc_norm\": 0.45595854922279794,\n \"acc_norm_stderr\": 0.035944137112724366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3153846153846154,\n \"acc_stderr\": 0.02355964698318995,\n \
\ \"acc_norm\": 0.3153846153846154,\n \"acc_norm_stderr\": 0.02355964698318995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.03038835355188684,\n \
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.03038835355188684\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5137614678899083,\n \"acc_stderr\": 0.02142920208987408,\n \"\
acc_norm\": 0.5137614678899083,\n \"acc_norm_stderr\": 0.02142920208987408\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993652,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993652\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4936708860759494,\n \"acc_stderr\": 0.03254462010767859,\n \
\ \"acc_norm\": 0.4936708860759494,\n \"acc_norm_stderr\": 0.03254462010767859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.4080717488789238,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.045291468044357915,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.045291468044357915\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.04948637324026637,\n\
\ \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.04948637324026637\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.49169859514687103,\n\
\ \"acc_stderr\": 0.017877498991072008,\n \"acc_norm\": 0.49169859514687103,\n\
\ \"acc_norm_stderr\": 0.017877498991072008\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.38439306358381503,\n \"acc_stderr\": 0.026189666966272028,\n\
\ \"acc_norm\": 0.38439306358381503,\n \"acc_norm_stderr\": 0.026189666966272028\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.027826109307283686,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.027826109307283686\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.34726688102893893,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.34726688102893893,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4382716049382716,\n \"acc_stderr\": 0.027607914087400473,\n\
\ \"acc_norm\": 0.4382716049382716,\n \"acc_norm_stderr\": 0.027607914087400473\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2978723404255319,\n \"acc_stderr\": 0.027281608344469417,\n \
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.027281608344469417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30638852672750977,\n\
\ \"acc_stderr\": 0.011773980329380722,\n \"acc_norm\": 0.30638852672750977,\n\
\ \"acc_norm_stderr\": 0.011773980329380722\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.028888193103988633,\n\
\ \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.028888193103988633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3836734693877551,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.3836734693877551,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43781094527363185,\n\
\ \"acc_stderr\": 0.035080801121998406,\n \"acc_norm\": 0.43781094527363185,\n\
\ \"acc_norm_stderr\": 0.035080801121998406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4619883040935672,\n \"acc_stderr\": 0.038237270928823064,\n\
\ \"acc_norm\": 0.4619883040935672,\n \"acc_norm_stderr\": 0.038237270928823064\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4429426283623105,\n\
\ \"mc2_stderr\": 0.014956758030618461\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6258879242304657,\n \"acc_stderr\": 0.013599792958329826\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10083396512509477,\n \
\ \"acc_stderr\": 0.008294031192126594\n }\n}\n```"
repo_url: https://huggingface.co/RESMPDEV/Gemma-Wukong-2b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|arc:challenge|25_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|gsm8k|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hellaswag|10_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-41-43.707476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-03-10.199714.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T03-03-10.199714.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- '**/details_harness|winogrande|5_2024-02-29T18-41-43.707476.parquet'
- split: 2024_03_02T03_03_10.199714
path:
- '**/details_harness|winogrande|5_2024-03-02T03-03-10.199714.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T03-03-10.199714.parquet'
- config_name: results
data_files:
- split: 2024_02_29T18_41_43.707476
path:
- results_2024-02-29T18-41-43.707476.parquet
- split: 2024_03_02T03_03_10.199714
path:
- results_2024-03-02T03-03-10.199714.parquet
- split: latest
path:
- results_2024-03-02T03-03-10.199714.parquet
---
# Dataset Card for Evaluation run of RESMPDEV/Gemma-Wukong-2b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RESMPDEV/Gemma-Wukong-2b](https://huggingface.co/RESMPDEV/Gemma-Wukong-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong-2b",
"harness_winogrande_5",
split="train")
```
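The timestamped split names use underscores (e.g. `2024_03_02T03_03_10.199714`) where the result filenames use hyphens and colons, so they don't parse directly as ISO 8601. As a minimal sketch (the `parse_split_name` helper is illustrative, not part of the `datasets` API), you can normalize a split name to pick the most recent run yourself instead of relying on the `latest` split:

```python
from datetime import datetime


def parse_split_name(name: str) -> datetime:
    """Convert a split name like '2024_03_02T03_03_10.199714' to a datetime."""
    date_part, time_part = name.split("T")
    # Date underscores become hyphens, time underscores become colons.
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)


splits = ["2024_02_29T18_41_43.707476", "2024_03_02T03_03_10.199714"]
newest = max(splits, key=parse_split_name)
print(newest)  # the split the "latest" config also points to
```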
## Latest results
These are the [latest results from run 2024-03-02T03:03:10.199714](https://huggingface.co/datasets/open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong-2b/blob/main/results_2024-03-02T03-03-10.199714.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3831572150396804,
"acc_stderr": 0.034077698841351076,
"acc_norm": 0.38708705016212447,
"acc_norm_stderr": 0.03485851209755768,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4429426283623105,
"mc2_stderr": 0.014956758030618461
},
"harness|arc:challenge|25": {
"acc": 0.4249146757679181,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.45307167235494883,
"acc_norm_stderr": 0.01454689205200563
},
"harness|hellaswag|10": {
"acc": 0.5052778331009758,
"acc_stderr": 0.004989503417767287,
"acc_norm": 0.6693885680143398,
"acc_norm_stderr": 0.00469471891822576
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3618421052631579,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.3618421052631579,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.03550683989165582,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.03550683989165582
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.02293097307163335,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.02293097307163335
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818114,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818114
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.42258064516129035,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.42258064516129035,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.03878372113711275,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.03878372113711275
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.035944137112724366,
"acc_norm": 0.45595854922279794,
"acc_norm_stderr": 0.035944137112724366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3153846153846154,
"acc_stderr": 0.02355964698318995,
"acc_norm": 0.3153846153846154,
"acc_norm_stderr": 0.02355964698318995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.03038835355188684,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.03038835355188684
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5137614678899083,
"acc_stderr": 0.02142920208987408,
"acc_norm": 0.5137614678899083,
"acc_norm_stderr": 0.02142920208987408
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993652,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4936708860759494,
"acc_stderr": 0.03254462010767859,
"acc_norm": 0.4936708860759494,
"acc_norm_stderr": 0.03254462010767859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3816793893129771,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.3816793893129771,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.047500773411999854,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.047500773411999854
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.4854368932038835,
"acc_stderr": 0.04948637324026637,
"acc_norm": 0.4854368932038835,
"acc_norm_stderr": 0.04948637324026637
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.49169859514687103,
"acc_stderr": 0.017877498991072008,
"acc_norm": 0.49169859514687103,
"acc_norm_stderr": 0.017877498991072008
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.38439306358381503,
"acc_stderr": 0.026189666966272028,
"acc_norm": 0.38439306358381503,
"acc_norm_stderr": 0.026189666966272028
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.027826109307283686,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.027826109307283686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.34726688102893893,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.34726688102893893,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4382716049382716,
"acc_stderr": 0.027607914087400473,
"acc_norm": 0.4382716049382716,
"acc_norm_stderr": 0.027607914087400473
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.027281608344469417,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.027281608344469417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30638852672750977,
"acc_stderr": 0.011773980329380722,
"acc_norm": 0.30638852672750977,
"acc_norm_stderr": 0.011773980329380722
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3836734693877551,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.3836734693877551,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43781094527363185,
"acc_stderr": 0.035080801121998406,
"acc_norm": 0.43781094527363185,
"acc_norm_stderr": 0.035080801121998406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4619883040935672,
"acc_stderr": 0.038237270928823064,
"acc_norm": 0.4619883040935672,
"acc_norm_stderr": 0.038237270928823064
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4429426283623105,
"mc2_stderr": 0.014956758030618461
},
"harness|winogrande|5": {
"acc": 0.6258879242304657,
"acc_stderr": 0.013599792958329826
},
"harness|gsm8k|5": {
"acc": 0.10083396512509477,
"acc_stderr": 0.008294031192126594
}
}
```
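The `*_stderr` fields in the results above are standard errors, so an approximate confidence interval can be derived from any reported score. A minimal sketch (the normal-approximation 95% interval is our own reading, not something the harness reports):

```python
def confidence_interval(score: float, stderr: float, z: float = 1.96):
    """Approximate 95% CI for a reported metric, assuming a normal approximation."""
    return (score - z * stderr, score + z * stderr)

# GSM8K accuracy and stderr taken from the results block above
low, high = confidence_interval(0.10083396512509477, 0.008294031192126594)
print(f"gsm8k acc 95% CI: [{low:.4f}, {high:.4f}]")  # [0.0846, 0.1171]
```

This is a quick sanity check on how much the small GSM8K score could move between runs; it does not account for any correlation across evaluation items.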
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
JovialValley/syllable_totalMapped0 | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 108518696
num_examples: 389
- name: test
num_bytes: 27131260
num_examples: 98
download_size: 136632106
dataset_size: 135649956
---
# Dataset Card for "syllable_totalMapped0"
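The split statistics in the metadata above give a rough sense of how much data each example carries. A back-of-the-envelope check (the byte and example counts are copied from the YAML; the per-example figure is simply `num_bytes / num_examples`):

```python
# Split sizes copied from the dataset metadata above
splits = {
    "train": {"num_bytes": 108518696, "num_examples": 389},
    "test": {"num_bytes": 27131260, "num_examples": 98},
}

for name, info in splits.items():
    avg_kib = info["num_bytes"] / info["num_examples"] / 1024
    print(f"{name}: ~{avg_kib:.0f} KiB per example")
```

Both splits come out around 270 KiB per example; if the float32 `input_values` dominate and the audio is sampled at 16 kHz (an assumption, not stated in the card), that corresponds to roughly 4 seconds of audio per clip.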
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhruv107/docs_pro_max_Mar_19 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 1517884764.08
num_examples: 1005
- name: validation
num_bytes: 252768380.0
num_examples: 166
- name: test
num_bytes: 13610810.0
num_examples: 12
download_size: 1775304803
dataset_size: 1784263954.08
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-6a5d0b-66069145576 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: Laurie/QA-distilbert
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Laurie/QA-distilbert
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
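The `col_mapping` block in the metadata maps the evaluator's expected column names to dataset fields, with dotted names such as `answers.text` pointing into nested features. A minimal sketch of how such a mapping can be resolved against a single SQuAD-style example (the resolver is our illustration of the dotted-path convention, not AutoTrain's actual implementation):

```python
# Mapping copied from the eval_info metadata above
col_mapping = {
    "context": "context",
    "question": "question",
    "answers-text": "answers.text",
    "answers-answer_start": "answers.answer_start",
}

def resolve(example: dict, dotted_path: str):
    """Follow a dotted path like 'answers.text' into a nested record."""
    value = example
    for key in dotted_path.split("."):
        value = value[key]
    return value

# Hypothetical example record in the adversarial_qa schema
example = {
    "context": "Paris is the capital of France.",
    "question": "What is the capital of France?",
    "answers": {"text": ["Paris"], "answer_start": [0]},
}
mapped = {target: resolve(example, source) for target, source in col_mapping.items()}
```

After mapping, the evaluator sees flat keys (`answers-text`, `answers-answer_start`) regardless of how the source dataset nests its answer fields.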
## Contributions
Thanks to [@jtatman](https://huggingface.co/jtatman) for evaluating this model. |
joey234/mmlu-marketing-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 130779
num_examples: 234
download_size: 73638
dataset_size: 130779
---
# Dataset Card for "mmlu-marketing-rule-neg-prepend"
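The `answer` feature above is a `class_label` whose integer values decode to the letters A–D. Decoding a stored label, or rendering a question with lettered choices, is a simple index lookup (the label names are copied from the YAML; the example question is hypothetical):

```python
ANSWER_NAMES = ["A", "B", "C", "D"]  # class_label names '0'..'3' from the metadata

def label_to_letter(label: int) -> str:
    """Decode a stored class_label integer to its answer letter."""
    return ANSWER_NAMES[label]

def format_question(question: str, choices: list[str]) -> str:
    """Render a question with its lettered answer choices."""
    lines = [question] + [f"{ANSWER_NAMES[i]}. {c}" for i, c in enumerate(choices)]
    return "\n".join(lines)

print(label_to_letter(2))  # "C"
```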
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamo1139/AEZAKMI_v3-2 | ---
license: other
license_name: other
license_link: LICENSE
---
|
open-llm-leaderboard/details_timdettmers__guanaco-65b-merged | ---
pretty_name: Evaluation run of timdettmers/guanaco-65b-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [timdettmers/guanaco-65b-merged](https://huggingface.co/timdettmers/guanaco-65b-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_timdettmers__guanaco-65b-merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T00:17:34.582006](https://huggingface.co/datasets/open-llm-leaderboard/details_timdettmers__guanaco-65b-merged/blob/main/results_2023-08-18T00%3A17%3A34.582006.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25104389504062485,\n\
\ \"acc_stderr\": 0.030647487837110618,\n \"acc_norm\": 0.2523346329049775,\n\
\ \"acc_norm_stderr\": 0.030669736900925226,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752346,\n \"mc2\": 0.4840947451540454,\n\
\ \"mc2_stderr\": 0.016324348732205056\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n\
\ \"acc_norm\": 0.27474402730375425,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2615016928898626,\n\
\ \"acc_stderr\": 0.004385544487143912,\n \"acc_norm\": 0.26598287193786097,\n\
\ \"acc_norm_stderr\": 0.004409521343140112\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n\
\ \"acc_stderr\": 0.011311347690633881,\n \"acc_norm\": 0.26792698826597133,\n\
\ \"acc_norm_stderr\": 0.011311347690633881\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752346,\n\
\ \"mc2\": 0.4840947451540454,\n \"mc2_stderr\": 0.016324348732205056\n\
\ }\n}\n```"
repo_url: https://huggingface.co/timdettmers/guanaco-65b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:17:34.582006.parquet'
- config_name: results
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- results_2023-08-18T00:17:34.582006.parquet
- split: latest
path:
- results_2023-08-18T00:17:34.582006.parquet
---
# Dataset Card for Evaluation run of timdettmers/guanaco-65b-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/timdettmers/guanaco-65b-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [timdettmers/guanaco-65b-merged](https://huggingface.co/timdettmers/guanaco-65b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_timdettmers__guanaco-65b-merged",
"harness_truthfulqa_mc_0",
split="train")
```
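Each per-task config name follows directly from the harness task key used in the results JSON below (for example `harness|hendrycksTest-anatomy|5` corresponds to the config `harness_hendrycksTest_anatomy_5`). A minimal sketch of that mapping, assuming only the naming convention visible in this card (the helper name is ours, not part of the `datasets` API):

```python
def task_key_to_config_name(task_key: str) -> str:
    """Derive a load_dataset config name from a harness task key,
    e.g. "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0"."""
    # Config names replace the "|", "-" and ":" separators with "_".
    for sep in ("|", "-", ":"):
        task_key = task_key.replace(sep, "_")
    return task_key

print(task_key_to_config_name("harness|hendrycksTest-anatomy|5"))
# -> harness_hendrycksTest_anatomy_5
```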
## Latest results
These are the [latest results from run 2023-08-18T00:17:34.582006](https://huggingface.co/datasets/open-llm-leaderboard/details_timdettmers__guanaco-65b-merged/blob/main/results_2023-08-18T00%3A17%3A34.582006.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25104389504062485,
"acc_stderr": 0.030647487837110618,
"acc_norm": 0.2523346329049775,
"acc_norm_stderr": 0.030669736900925226,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752346,
"mc2": 0.4840947451540454,
"mc2_stderr": 0.016324348732205056
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.27474402730375425,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.2615016928898626,
"acc_stderr": 0.004385544487143912,
"acc_norm": 0.26598287193786097,
"acc_norm_stderr": 0.004409521343140112
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891148,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891148
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26792698826597133,
"acc_stderr": 0.011311347690633881,
"acc_norm": 0.26792698826597133,
"acc_norm_stderr": 0.011311347690633881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752346,
"mc2": 0.4840947451540454,
"mc2_stderr": 0.016324348732205056
}
}
```
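The per-task entries above all share a uniform shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so the downloaded results JSON can be scanned programmatically. A minimal sketch, using a small excerpt of the values above (the full file holds one entry per evaluated task):

```python
# Excerpt of the results JSON shown above, keyed by "harness|<task>|<n_shots>".
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.3216374269005848, "acc_norm": 0.3216374269005848},
    "harness|hendrycksTest-philosophy|5": {"acc": 0.1864951768488746, "acc_norm": 0.1864951768488746},
    "harness|hendrycksTest-virology|5": {"acc": 0.28313253012048195, "acc_norm": 0.28313253012048195},
}

def best_task(scores, metric="acc_norm"):
    """Return the task name with the highest value for `metric`."""
    return max(scores, key=lambda task: scores[task][metric])

print(best_task(results))  # harness|hendrycksTest-world_religions|5
```

The same pattern works for the full results file once loaded with `json.load`, e.g. to rank all 57 MMLU subtasks by `acc_norm`.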
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
JoaoFassina/pokemon_anotated | ---
license: mit
---
|
Rosenberg/IMCS-NER | ---
license: mit
---
|
jiwon65/aihub_general_6000_for_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: audio
sequence: float32
splits:
- name: train
num_bytes: 1055536321
num_examples: 6000
download_size: 898095664
dataset_size: 1055536321
---
# Dataset Card for "korean-general-command-voice_6000-12000_samplingRate-16000_for_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
c00k1ez/summarization | ---
dataset_info:
features:
- name: chapter_id
dtype: int64
- name: book_id
dtype: int64
- name: chapter_title
dtype: string
- name: chapter_summary
dtype: string
- name: source
dtype: int64
- name: chapters_text
dtype: string
splits:
- name: train
num_bytes: 25303291
num_examples: 816
- name: validation
num_bytes: 3636465
num_examples: 139
download_size: 14675842
dataset_size: 28939756
---
# Dataset Card for "summarization"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
siva123444/voice | ---
license: openrail
---
|
open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b | ---
pretty_name: Evaluation run of NousResearch/Nous-Hermes-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/Nous-Hermes-13b](https://huggingface.co/NousResearch/Nous-Hermes-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T04:00:41.897332](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b/blob/main/results_2023-10-19T04-00-41.897332.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2930998322147651,\n\
\ \"em_stderr\": 0.00466150847986569,\n \"f1\": 0.37501048657718355,\n\
\ \"f1_stderr\": 0.004576570475121802,\n \"acc\": 0.41817812997218123,\n\
\ \"acc_stderr\": 0.009868526609981134\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2930998322147651,\n \"em_stderr\": 0.00466150847986569,\n\
\ \"f1\": 0.37501048657718355,\n \"f1_stderr\": 0.004576570475121802\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \
\ \"acc_stderr\": 0.00761565027710669\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855576\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NousResearch/Nous-Hermes-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|arc:challenge|25_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T04_00_41.897332
path:
- '**/details_harness|drop|3_2023-10-19T04-00-41.897332.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T04-00-41.897332.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T04_00_41.897332
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-00-41.897332.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-00-41.897332.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hellaswag|10_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T15:33:41.626742.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T15:33:41.626742.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T04_00_41.897332
path:
- '**/details_harness|winogrande|5_2023-10-19T04-00-41.897332.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T04-00-41.897332.parquet'
- config_name: results
data_files:
- split: 2023_07_18T15_33_41.626742
path:
- results_2023-07-18T15:33:41.626742.parquet
- split: 2023_10_19T04_00_41.897332
path:
- results_2023-10-19T04-00-41.897332.parquet
- split: latest
path:
- results_2023-10-19T04-00-41.897332.parquet
---
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Nous-Hermes-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-13b](https://huggingface.co/NousResearch/Nous-Hermes-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T04:00:41.897332](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b/blob/main/results_2023-10-19T04-00-41.897332.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2930998322147651,
"em_stderr": 0.00466150847986569,
"f1": 0.37501048657718355,
"f1_stderr": 0.004576570475121802,
"acc": 0.41817812997218123,
"acc_stderr": 0.009868526609981134
},
"harness|drop|3": {
"em": 0.2930998322147651,
"em_stderr": 0.00466150847986569,
"f1": 0.37501048657718355,
"f1_stderr": 0.004576570475121802
},
"harness|gsm8k|5": {
"acc": 0.08339651250947688,
"acc_stderr": 0.00761565027710669
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855576
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ovior/twitter_dataset_1713042523 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2310449
num_examples: 7143
download_size: 1300560
dataset_size: 2310449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ineoApp/ds_factures_999 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': numero facture
'2': Telephone
'3': Email
'4': Site web
'5': RC
'6': CNSS
'7': TP
'8': Mode de paiement
'9': fournisseur
'10': date facture
'11': date limite
'12': montant ht
'13': montant ttc
'14': tva
'15': prix tva
'16': addresse
'17': reference
'18': Devise
'19': ICE fournisseur
'20': IF fournisseur
'21': Condition de paiement
'22': informations
'23': art1 designation
'24': art1 quantite
'25': art1 prix unit
'26': art1 tva
'27': art1 montant ht
'28': art1 Article
'29': art1 taux de remise
'30': art2 designation
'31': art2 quantite
'32': art2 prix unit
'33': art2 tva
'34': art2 montant ht
'35': art2 Article
'36': art2 taux de remise
'37': art3 designation
'38': art3 quantite
'39': art3 prix unit
'40': art3 tva
'41': art3 montant ht
'42': art3 Article
'43': art3 taux de remise
'44': art4 designation
'45': art4 quantite
'46': art4 prix unit
'47': art4 tva
'48': art4 montant ht
'49': art4 Article
'50': art4 taux de remise
'51': art5 designation
'52': art5 quantite
'53': art5 prix unit
'54': art5 tva
'55': art5 montant ht
'56': art5 Article
'57': art5 taux de remise
'58': art6 designation
'59': art6 quantite
'60': art6 prix unit
'61': art6 tva
'62': art6 montant ht
'63': art6 Article
'64': art6 taux de remise
'65': art7 designation
'66': art7 quantite
'67': art7 prix unit
'68': art7 tva
'69': art7 montant ht
'70': art7 Article
'71': art7 taux de remise
'72': art8 designation
'73': art8 quantite
'74': art8 prix unit
'75': art8 tva
'76': art8 montant ht
'77': art8 Article
'78': art8 taux de remise
'79': art9 designation
'80': art9 quantite
'81': art9 prix unit
'82': art9 tva
'83': art9 montant ht
'84': art9 Article
'85': art9 taux de remise
'86': art10 designation
'87': art10 quantite
'88': art10 prix unit
'89': art10 tva
'90': art10 montant ht
'91': art10 Article
'92': art10 taux de remise
'93': art11 designation
'94': art11 quantite
'95': art11 prix unit
'96': art11 tva
'97': art11 montant ht
'98': art11 Article
'99': art11 taux de remise
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 235900852.0
num_examples: 172
- name: test
num_bytes: 58975213.0
num_examples: 43
download_size: 270968837
dataset_size: 294876065.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
unigram/fol-06 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: proof
dtype: string
- name: premise_tptp
dtype: string
- name: hypothesis_tptp
dtype: string
- name: deberta_pred
dtype: string
- name: deberta_pred_r1_label
dtype: string
- name: deberta_pred_r2_label
dtype: string
- name: deberta_pred_r3_label
dtype: string
- name: deberta_pred_r4_label
dtype: string
- name: deberta_pred_r5_label
dtype: string
- name: deberta_pred_r6_label
dtype: string
splits:
- name: test
num_bytes: 5434738
num_examples: 663
- name: train
num_bytes: 10725933
num_examples: 1350
- name: validation
num_bytes: 4171476
num_examples: 627
download_size: 3640138
dataset_size: 20332147
---
# Dataset Card for "fol-06"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0x22almostEvil/tatoeba-mt-qna-oa | ---
license: cc-by-2.0
task_categories:
- question-answering
- translation
language:
- en
- ru
- de
- uk
- sv
- sr
- sl
- ro
- pt
- pl
- nl
- it
- hu
- hr
- fr
- es
- da
- cs
- ca
- bg
tags:
- QnA
- tatoeba
- Translation
pretty_name: tatoeba-mt-qna-oa-120k
size_categories:
- 100K<n<1M
---
# Dataset Card for multilingual Tatoeba QnA translation with ~120K entries
### Dataset Summary
Contains a Parquet file with a list of instructions and translation entries in different languages.
Each row consists of:
* INSTRUCTION
* RESPONSE
* SOURCE (tatoeba)
* METADATA (a JSON object with language, text length, UUID, and langs-pair).
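As a minimal sketch of how one row might be consumed: the field layout below follows the card's description, but the exact METADATA key names (`"language"`, `"length"`, `"langs-pair"`) are assumptions, not verified against the actual Parquet file.

```python
import json

# Hypothetical example row; METADATA is stored as a JSON string,
# so it needs an explicit json.loads before its keys can be read.
row = {
    "INSTRUCTION": "Translate to German: Where is the train station?",
    "RESPONSE": "Wo ist der Bahnhof?",
    "SOURCE": "tatoeba",
    "METADATA": '{"language": "de", "length": 19, "langs-pair": "en-de"}',
}

meta = json.loads(row["METADATA"])  # parse the metadata payload
print(row["INSTRUCTION"], "->", meta["langs-pair"])
```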
### The original dataset is available here:
* https://huggingface.co/datasets/Helsinki-NLP/tatoeba_mt |
openaccess-ai-collective/60400ba9de135414daa1e434e633cb57 | Invalid username or password. |
slhenty/climate-fever-nli-stsb | ---
license: unknown
viewer: false
---
**==========================================**
**_IN PROGRESS - NOT READY FOR LOADING OR USE_**
**==========================================**
---
# Dataset Card for climate-fever-nli-stsb
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The CLIMATE-FEVER dataset modified to supply NLI-style (**cf-nli**) features or STSb-style (**cf-stsb**) features that SentenceBERT training scripts can use as drop-in replacements for AllNLI and/or STSb datasets.
There are two **cf-nli** datasets: one derived only from SUPPORTS and REFUTES evidence (**cf-nli**), and one that also derives data from NOT_ENOUGH_INFO evidence based on the annotator votes (**cf-nli-nei**).
The feature style is specified as a named configuration when loading the dataset: cf-nli, cf-nli-nei, or cf-stsb. See usage notes below for `load_dataset` examples.
### Usage
Load the **cf-nli** dataset
```python
# if datasets not already in your environment
!pip install datasets
from datasets import load_dataset
# all splits...
dd = load_dataset('climate-fever-nli-stsb', 'cf-nli')
# ... or specific split (only 'train' is available)
ds_train = load_dataset('climate-fever-nli-stsb', 'cf-nli', split='train')
## ds_train can now be injected into SentenceBERT training scripts at the point
## where individual sentence pairs are aggregated into
## {'claim': {'entailment': set(), 'contradiction': set(), 'neutral': set()}} dicts
## for further processing into training samples
```
Load the **cf-nli-nei** dataset
```python
# if datasets not already in your environment
!pip install datasets
from datasets import load_dataset
# all splits...
dd = load_dataset('climate-fever-nli-stsb', 'cf-nli-nei')
# ... or specific split (only 'train' is available)
ds_train = load_dataset('climate-fever-nli-stsb', 'cf-nli-nei', split='train')
## ds_train can now be injected into SentenceBERT training scripts at the point
## where individual sentence pairs are aggregated into
## {'claim': {'entailment': set(), 'contradiction': set(), 'neutral': set()}} dicts
## for further processing into training samples
```
Load the **cf-stsb** dataset
```python
# if datasets not already in your environment
!pip install datasets
from datasets import load_dataset
# all splits...
dd = load_dataset('climate-fever-nli-stsb', 'cf-stsb')
# ... or specific split ('train', 'dev', 'test' available)
ds_dev = load_dataset('climate-fever-nli-stsb', 'cf-stsb', split='dev')
## ds_dev (or test) can now be injected into SentenceBERT training scripts at the point
## where individual sentence pairs are aggregated into
## a list of dev (or test) samples
```
<!--
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
-->
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
SentenceBERT models are designed for 'Domain Adaptation' and/or 'Fine-tuning' using labeled data in the downstream task domain. As a bi-encoder, the primary objective function is real-valued similarity scoring. Typical training datasets use NLI-style features as input, and STSb-style features both for model evaluation during training and to measure post-hoc, _intrinsic_ STSb performance. Classification tasks typically use a classifier network that accepts SentenceBERT encodings as input and is trained on class-labeled datasets.
So, to fine-tune a SentenceBERT model in a climate-change domain, a labeled climate change dataset would be ideal. Much like the authors of the CLIMATE-FEVER dataset, we know of no other _labeled_ datasets specific to climate change. And while CLIMATE-FEVER is suitably labeled for classification tasks, it is not ready for similarity tuning in the style of SentenceBERT.
This modified CLIMATE-FEVER dataset attempts to fill that gap by deriving NLI-style features typically used in pre-training and fine-tuning a SentenceBERT model. SentenceBERT also uses STSb-style features to evaluate model performance both during training and after training to gauge _intrinsic_ model performance on STSb.
### Source Data
#### Initial Data Collection and Normalization
see CLIMATE-FEVER
#### Who are the source language producers?
see CLIMATE-FEVER
<!--
### Annotations
-->
### Annotation process
#### **cf-nli**
For each Claim that has both SUPPORTS evidence and REFUTES evidence, create labeled pairs in the style of NLI dataset:
| split | dataset | sentence1 | sentence2 | label |
|---|---|---|---|---|
| {'train', 'test'} | 'climate-fever' | claim | evidence | evidence_label SUPPORTS -> 'entailment', REFUTES -> 'contradiction' |
> Note that by definition, only claims classified as DISPUTED include both SUPPORTS and REFUTES evidence, so this dataset is limited to a small subset of CLIMATE-FEVER.
#### **cf-nli-nei**
This dataset uses the list of annotator 'votes' to cast a NOT_ENOUGH_INFO (NEI) evidence to a SUPPORTS or REFUTES evidence. By doing so, Claims in the SUPPORTS, REFUTES, and NEI classes can be used to generate additional sentence pairs.
| votes | effective evidence_label |
|---|---|
| SUPPORTS > REFUTES | _SUPPORTS_ |
| SUPPORTS < REFUTES | _REFUTES_ |
In addition to all the claims in **cf-nli**, any claims that have
* **_at least one_** SUPPORTS or REFUTES evidence, AND
* NEI evidence that can be cast to effective _SUPPORTS_ or _REFUTES_
are included in the dataset.
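The vote-majority casting rule above can be sketched as a small function; the function name and vote encoding are illustrative, not taken from the dataset's actual preprocessing code:

```python
def cast_nei_evidence(votes):
    """Map a NOT_ENOUGH_INFO evidence to an effective label by vote majority."""
    supports = votes.count("SUPPORTS")
    refutes = votes.count("REFUTES")
    if supports > refutes:
        return "SUPPORTS"
    if supports < refutes:
        return "REFUTES"
    return None  # tied votes: the evidence stays NEI and is excluded

print(cast_nei_evidence(["SUPPORTS", "SUPPORTS", "REFUTES"]))  # -> SUPPORTS
```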
#### **cf-stsb**
For each Claim <-> Evidence pair, create labeled pairs in the style of STSb dataset:
| split | dataset | score | sentence1 | sentence2 |
|---|---|---|---|---|
| {'train', 'dev', 'test'} | 'climate-fever' | cos_sim score | claim | evidence |
This dataset uses 'evidence_label', vote 'entropy', and the list of annotator 'votes' to derive a similarity score for each claim <-> evidence pairing. Similarity score conversion:
> `mean(entropy)` refers to the average entropy within the defined group of evidence
| evidence_label | votes | similarity score |
|---|---|---|
| SUPPORTS | SUPPORTS > 0, REFUTES == 0, NOT_ENOUGH_INFO (NEI) == 0 | 1 |
| | SUPPORTS > 0, REFUTES == 0 | mean(entropy) |
| | SUPPORTS > 0, REFUTES > 0 | 1 - mean(entropy) |
| NEI | SUPPORTS > REFUTES | (1 - mean(entropy)) / 2|
| | SUPPORTS == REFUTES | 0 |
| | SUPPORTS < REFUTES | -(1 - mean(entropy)) / 2 |
| REFUTES | SUPPORTS > 0, REFUTES > 0 | -(1 - mean(entropy)) |
| | SUPPORTS == 0, REFUTES > 0 | -mean(entropy) |
| | SUPPORTS == 0, REFUTES > 0, NEI == 0 | -1 |
The above derivation roughly maps the strength of evidence annotation (REFUTES..NEI..SUPPORTS) to cosine similarity (-1..0..1).
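The score table above can be sketched as a scoring function. This is a hypothetical reimplementation of the mapping for illustration; the function name, signature, and vote-count encoding are assumptions, not the dataset's actual code:

```python
def similarity_score(evidence_label, supports, refutes, nei, mean_entropy):
    """Map evidence label, vote counts, and mean entropy to a cosine-like score."""
    if evidence_label == "SUPPORTS":
        if refutes == 0 and nei == 0:
            return 1.0
        if refutes == 0:
            return mean_entropy
        return 1.0 - mean_entropy           # SUPPORTS with some REFUTES votes
    if evidence_label == "NOT_ENOUGH_INFO":
        if supports > refutes:
            return (1.0 - mean_entropy) / 2
        if supports < refutes:
            return -(1.0 - mean_entropy) / 2
        return 0.0                          # tied votes
    # evidence_label == "REFUTES"
    if supports == 0 and nei == 0:
        return -1.0
    if supports == 0:
        return -mean_entropy
    return -(1.0 - mean_entropy)            # REFUTES with some SUPPORTS votes

print(similarity_score("SUPPORTS", 3, 0, 0, 0.4))  # -> 1.0
```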
<!--
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
--> |
CyberHarem/carmilla_rezero | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Carmilla (Re:Zero Kara Hajimeru Isekai Seikatsu)
This is the dataset of Carmilla (Re:Zero Kara Hajimeru Isekai Seikatsu), containing 43 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_8_10000000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 192259
num_examples: 6699
download_size: 123193
dataset_size: 192259
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_8_10000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ozayezerceli__Threnystril-2x7B-moe | ---
pretty_name: Evaluation run of ozayezerceli/Threnystril-2x7B-moe
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ozayezerceli/Threnystril-2x7B-moe](https://huggingface.co/ozayezerceli/Threnystril-2x7B-moe)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ozayezerceli__Threnystril-2x7B-moe\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T22:07:40.775337](https://huggingface.co/datasets/open-llm-leaderboard/details_ozayezerceli__Threnystril-2x7B-moe/blob/main/results_2024-03-29T22-07-40.775337.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6340230941969293,\n\
\ \"acc_stderr\": 0.03230426358452076,\n \"acc_norm\": 0.6380606017640912,\n\
\ \"acc_norm_stderr\": 0.032965808642980834,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.527055756270787,\n\
\ \"mc2_stderr\": 0.015569108390237995\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244081,\n\
\ \"acc_norm\": 0.5281569965870307,\n \"acc_norm_stderr\": 0.0145882041051022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4197371041625174,\n\
\ \"acc_stderr\": 0.004925072159723838,\n \"acc_norm\": 0.7336188010356502,\n\
\ \"acc_norm_stderr\": 0.0044116243741767\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n\
\ \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903336,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903336\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069353,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069353\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.01624202883405361,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.01624202883405361\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294674,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.01271994954303221,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.01271994954303221\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.527055756270787,\n\
\ \"mc2_stderr\": 0.015569108390237995\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223183\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5868081880212282,\n \
\ \"acc_stderr\": 0.013563326951984374\n }\n}\n```"
repo_url: https://huggingface.co/ozayezerceli/Threnystril-2x7B-moe
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|arc:challenge|25_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|gsm8k|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hellaswag|10_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T22-07-40.775337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T22-07-40.775337.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- '**/details_harness|winogrande|5_2024-03-29T22-07-40.775337.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T22-07-40.775337.parquet'
- config_name: results
data_files:
- split: 2024_03_29T22_07_40.775337
path:
- results_2024-03-29T22-07-40.775337.parquet
- split: latest
path:
- results_2024-03-29T22-07-40.775337.parquet
---
# Dataset Card for Evaluation run of ozayezerceli/Threnystril-2x7B-moe
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ozayezerceli/Threnystril-2x7B-moe](https://huggingface.co/ozayezerceli/Threnystril-2x7B-moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ozayezerceli__Threnystril-2x7B-moe",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-29T22:07:40.775337](https://huggingface.co/datasets/open-llm-leaderboard/details_ozayezerceli__Threnystril-2x7B-moe/blob/main/results_2024-03-29T22-07-40.775337.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6340230941969293,
"acc_stderr": 0.03230426358452076,
"acc_norm": 0.6380606017640912,
"acc_norm_stderr": 0.032965808642980834,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.527055756270787,
"mc2_stderr": 0.015569108390237995
},
"harness|arc:challenge|25": {
"acc": 0.49658703071672355,
"acc_stderr": 0.014611050403244081,
"acc_norm": 0.5281569965870307,
"acc_norm_stderr": 0.0145882041051022
},
"harness|hellaswag|10": {
"acc": 0.4197371041625174,
"acc_stderr": 0.004925072159723838,
"acc_norm": 0.7336188010356502,
"acc_norm_stderr": 0.0044116243741767
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474086,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474086
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903336,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903336
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069353,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069353
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.01624202883405361,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.01624202883405361
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294674,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.01271994954303221,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.01271994954303221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090081,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090081
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.527055756270787,
"mc2_stderr": 0.015569108390237995
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223183
},
"harness|gsm8k|5": {
"acc": 0.5868081880212282,
"acc_stderr": 0.013563326951984374
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/physics_dataset_standardized_cluster_1_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 13270296
num_examples: 8714
download_size: 0
dataset_size: 13270296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_cluster_1_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
christinacdl/clickbait_notclickbait_dataset | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- 10K<n<100K
---
0 : not clickbait
1 : clickbait
The dataset was cleaned of duplicates, keeping only the first occurrence of each text.
The dataset was split into train and test sets using a 0.2 split ratio, and the resulting test set was then split into test and validation sets using a 0.2 split ratio.
Size of training set: 43,802
Size of test set: 8,760
Size of validation set: 2,191
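The stated sizes are mutually consistent. A minimal sketch of the arithmetic, assuming each stage holds out 20% with the held-out count rounded up (as, e.g., scikit-learn's `train_test_split` does for a float `test_size`):

```python
import math

total = 43_802 + 8_760 + 2_191  # 54,753 deduplicated texts in all

# Stage 1: hold out 20% of the full dataset (held-out count rounded up).
holdout = math.ceil(total * 0.2)       # 10,951
train = total - holdout                # 43,802

# Stage 2: split the held-out portion, again holding out 20% as validation.
validation = math.ceil(holdout * 0.2)  # 2,191
test = holdout - validation            # 8,760

print(train, test, validation)  # -> 43802 8760 2191
```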
|
javaabu/dhivehi-majlis-speech | ---
license: apache-2.0
task_categories:
- automatic-speech-recognition
- text-to-speech
language:
- dv
tags:
- audio
- dhivehi
- speech
- majlis
- parliament
- political
size_categories:
- 1K<n<10K
---
# Dataset Card for Dhivehi Majlis Speech 1.0
### Dataset Summary
Dhivehi Majlis Speech is a Dhivehi speech dataset created from data annotated by [Javaabu Pvt. Ltd.](https://javaabu.com).
The dataset contains around 10.5 hours of speech collected from parliamentary sessions at the People's Majlis of Maldives (the Maldivian Parliament), consisting of audio from different MPs across 6 different sessions.
### Supported Tasks and Leaderboards
- Automatic Speech Recognition
- Text-to-Speech
### Languages
Dhivehi
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file and its sentence.
```json
{
'path': 'dhivehi-majlis-speech-train/waves/majlis_jalsa_24_11_2021_p9_73.wav',
'sentence': 'މިއަދު އަންނަމުންދާ ތަރައްޤީއަކީ، އަޅުގަނޑުމެންގެ ރައްޔިތުން އެތައް ދުވަހެއް ވަންދެން ކުރަމުންދިޔަ އުންމީދުތައް',
'audio': {
'path': 'dhivehi-majlis-speech-train/waves/majlis_jalsa_24_11_2021_p9_73.wav',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000
},
}
```
### Data Fields
- path (string): The path to the audio file.
- sentence (string): The transcription for the audio file.
- audio (dict): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column with `dataset[0]["audio"]`, the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling a large number of audio files can take a significant amount of time, so it is important to query the sample index before the "audio" column: `dataset[0]["audio"]` should always be preferred over `dataset["audio"][0]`.
### Data Splits
The speech material has been subdivided into portions for train, test and validation.
| | Train | Validation | Test | Total |
| ---------------- |----------|------------|----------|----------|
| Utterances | 4647 | 580 | 580 | 5807 |
| Duration | 08:23:06 | 01:20:28 | 01:03:55 | 10:47:28 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Data was provided by [The People's Majlis of Maldives](https://majlis.gov.mv/).
For transcript normalization, newlines and runs of whitespace have been replaced by single spaces.
Some of the text written in numerals and Arabic has been transliterated to Thaana; however, some Arabic strings and numerals may still remain.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
The audio was annotated using [Audino](https://github.com/midas-research/audino)
#### Who are the annotators?
Annotation was done by [Javaabu](https://javaabu.com) staff and hired annotators. These annotators were:
- Hassan Ulvan Mohamed
- Zayan Saudhulla
- Ibrahim Shareef
- Mohamed Jailam
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{Javaabu_2023,
title = "Dhivehi Majlis Speech Dataset",
url = "https://huggingface.co/datasets/javaabu/dhivehi-majlis-speech",
journal = "Hugging Face",
author = {{Javaabu Pvt. Ltd.}},
year = "2023",
month = jul
}
```
### Contributions
- [Arushad Ahmed](https://arushad.org)
- [Mohamed Jailam](https://github.com/muhammedjailam)
- [Ibrahim Shareef](https://github.com/ihshareef) |
mahdibaghbanzadeh/GUE_EMP_H3K4me2 | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 12566640
num_examples: 24545
- name: val
num_bytes: 1571328
num_examples: 3069
- name: test
num_bytes: 1571328
num_examples: 3069
download_size: 7416429
dataset_size: 15709296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-project-squad_v2-e06b4410-11855585 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/bert-base-uncased-squad2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/bert-base-uncased-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sjrlee](https://huggingface.co/sjrlee) for evaluating this model. |
medric49/gpt4llm-mix-1k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: rejected
dtype: string
- name: res:airedefined/pythia-14m-dolly-rag
dtype: string
- name: res:airedefined/gpt2-dolly-rag
dtype: string
splits:
- name: train
num_bytes: 1686302
num_examples: 1000
download_size: 558903
dataset_size: 1686302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gpt4llm-rm-training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeonardoTiger/Leonardo_Legends.VoiceLines | ---
license: wtfpl
---
|
zolak/twitter_dataset_81_1713187529 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 312309
num_examples: 742
download_size: 151318
dataset_size: 312309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ChrisRaynor/test_coco_edit | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 54649358.0
num_examples: 50
download_size: 54651974
dataset_size: 54649358.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joseluhf11/oct-object-detection-v3-merge | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
splits:
- name: train
num_bytes: 154014595.25
num_examples: 1246
download_size: 71638878
dataset_size: 154014595.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oct-object-detection-v3-merge"
The dataset is composed of images with multiple object detection boxes in COCO format (x, y, w, h). The images are OCT scans (a type of eye scan) with boxes indicating features associated with AMD disease.
The only difference from v2 is that the categories field must contain as many class labels as there are boxes annotated in each image, even if the class label is the same. So an image with 3 boxes for the same object must have 3 class labels.
[Source dataset](https://doi.org/10.1101/2023.03.29.534704)
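As an illustration of the schema and the v3 constraint above, here is a small sketch (the `objects` dict and the "drusen" label are made-up examples, not taken from the dataset) that checks the one-label-per-box invariant and converts COCO (x, y, w, h) boxes to corner coordinates:

```python
# Hypothetical `objects` entry shaped like the features schema above, with one
# class label per annotated box -- even when several boxes share the same label,
# as this v3 requires.
objects = {
    "bbox": [[10, 20, 30, 40], [15, 25, 30, 40], [50, 60, 20, 10]],
    "categories": ["drusen", "drusen", "drusen"],  # illustrative AMD feature
}

# v3 invariant: exactly as many class labels as boxes.
assert len(objects["categories"]) == len(objects["bbox"])

def coco_to_corners(box):
    """Convert a COCO-style (x, y, w, h) box to (x_min, y_min, x_max, y_max)."""
    x, y, w, h = box
    return (x, y, x + w, y + h)

corners = [coco_to_corners(b) for b in objects["bbox"]]
print(corners[0])  # -> (10, 20, 40, 60)
```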
|
open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.1 | ---
pretty_name: Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Swisslex/Mixtral-8x7b-DPO-v0.1](https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T00:07:22.506947](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.1/blob/main/results_2024-01-16T00-07-22.506947.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7049846845762283,\n\
\ \"acc_stderr\": 0.03051139695922859,\n \"acc_norm\": 0.7095501952770573,\n\
\ \"acc_norm_stderr\": 0.03110231839596143,\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.5738038140031454,\n\
\ \"mc2_stderr\": 0.015194846416368368\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6783276450511946,\n \"acc_stderr\": 0.013650488084494162,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907588\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6859191396136228,\n\
\ \"acc_stderr\": 0.0046320017323329835,\n \"acc_norm\": 0.8761202947619996,\n\
\ \"acc_norm_stderr\": 0.003287709741128806\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745643,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745643\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04434600701584925,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04434600701584925\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\
\ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\
\ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318995,\n \
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227627,\n \
\ \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227627\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205147,\n \
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205147\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.44370860927152317,\n \"acc_stderr\": 0.040565279022817306,\n \"\
acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.040565279022817306\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8770642201834863,\n \"acc_stderr\": 0.014078467983673376,\n \"\
acc_norm\": 0.8770642201834863,\n \"acc_norm_stderr\": 0.014078467983673376\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476076,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476076\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625852,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625852\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305733,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305733\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.021742519835276277,\n\
\ \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.021742519835276277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.02088869041409387,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.02088869041409387\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5319426336375489,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.5319426336375489,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7904411764705882,\n \"acc_stderr\": 0.02472311040767707,\n\
\ \"acc_norm\": 0.7904411764705882,\n \"acc_norm_stderr\": 0.02472311040767707\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7679738562091504,\n \"acc_stderr\": 0.017077373377856923,\n \
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.017077373377856923\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826369,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826369\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355044,\n\
\ \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.5738038140031454,\n\
\ \"mc2_stderr\": 0.015194846416368368\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320708\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5375284306292646,\n \
\ \"acc_stderr\": 0.013733636059107757\n }\n}\n```"
repo_url: https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|arc:challenge|25_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|gsm8k|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hellaswag|10_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-07-22.506947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T00-07-22.506947.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- '**/details_harness|winogrande|5_2024-01-16T00-07-22.506947.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T00-07-22.506947.parquet'
- config_name: results
data_files:
- split: 2024_01_16T00_07_22.506947
path:
- results_2024-01-16T00-07-22.506947.parquet
- split: latest
path:
- results_2024-01-16T00-07-22.506947.parquet
---
# Dataset Card for Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Swisslex/Mixtral-8x7b-DPO-v0.1](https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T00:07:22.506947](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.1/blob/main/results_2024-01-16T00-07-22.506947.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7049846845762283,
"acc_stderr": 0.03051139695922859,
"acc_norm": 0.7095501952770573,
"acc_norm_stderr": 0.03110231839596143,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.5738038140031454,
"mc2_stderr": 0.015194846416368368
},
"harness|arc:challenge|25": {
"acc": 0.6783276450511946,
"acc_stderr": 0.013650488084494162,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907588
},
"harness|hellaswag|10": {
"acc": 0.6859191396136228,
"acc_stderr": 0.0046320017323329835,
"acc_norm": 0.8761202947619996,
"acc_norm_stderr": 0.003287709741128806
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745643,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745643
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04434600701584925,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04434600701584925
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318995,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227627,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227627
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205147,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205147
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.040565279022817306,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.040565279022817306
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8770642201834863,
"acc_stderr": 0.014078467983673376,
"acc_norm": 0.8770642201834863,
"acc_norm_stderr": 0.014078467983673376
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476076,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476076
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709225,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709225
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625852,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625852
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305733,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305733
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.021742519835276277,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.021742519835276277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.02088869041409387,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.02088869041409387
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5319426336375489,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.5319426336375489,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7904411764705882,
"acc_stderr": 0.02472311040767707,
"acc_norm": 0.7904411764705882,
"acc_norm_stderr": 0.02472311040767707
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.017077373377856923,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.017077373377856923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900798,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355044,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.5738038140031454,
"mc2_stderr": 0.015194846416368368
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320708
},
"harness|gsm8k|5": {
"acc": 0.5375284306292646,
"acc_stderr": 0.013733636059107757
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MLCommons/peoples_speech_v1.0 | ---
annotations_creators:
- crowdsourced
- machine-generated
language_creators:
- crowdsourced
- machine-generated
language:
- en
license:
- cc-by-2.0
- cc-by-2.5
- cc-by-3.0
- cc-by-4.0
- cc-by-sa-3.0
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: People's Speech
size_categories:
- 1T<n
source_datasets:
- original
task_categories:
- automatic-speech-recognition
task_ids:
- speech-recognition
- robust-speech-recognition
- noisy-speech-recognition
---
# Dataset Card for People's Speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://mlcommons.org/en/peoples-speech/
- **Repository:** https://github.com/mlcommons/peoples-speech
- **Paper:** https://arxiv.org/abs/2111.09344
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [datasets@mlcommons.org](mailto:datasets@mlcommons.org)
### Dataset Summary
The People's Speech Dataset is among the world's largest English speech recognition corpora licensed for academic and commercial usage, under CC-BY-SA and CC-BY 4.0. It includes 30,000+ hours of transcribed English speech from a diverse set of speakers. This open dataset is large enough to train speech-to-text systems and, crucially, is available with a permissive license.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
English
## Dataset Structure
### Data Instances
```python
{
    "id": "gov_DOT_uscourts_DOT_scotus_DOT_19-161/gov_DOT_uscourts_DOT_scotus_DOT_19-161_DOT_2020-03-02_DOT_mp3_00002.flac",
    "audio": {
        "path": "gov_DOT_uscourts_DOT_scotus_DOT_19-161/gov_DOT_uscourts_DOT_scotus_DOT_19-161_DOT_2020-03-02_DOT_mp3_00002.flac",
        "array": array([-6.10351562e-05, ...]),
        "sampling_rate": 16000
    },
    "duration_ms": 14490,
    "text": "contends that the suspension clause requires a [...]"
}
```
### Data Fields
```python
{
    "id": datasets.Value("string"),
    "audio": datasets.Audio(sampling_rate=16_000),
    "duration_ms": datasets.Value("int32"),
    "text": datasets.Value("string"),
}
```
### Data Splits
We provide the following configurations for the dataset: `cc-by-clean`, `cc-by-dirty`, `cc-by-sa-clean`, `cc-by-sa-dirty`, and `microset`. We don't provide splits for any of the configurations.
## Dataset Creation
### Curation Rationale
See our [paper](https://arxiv.org/abs/2111.09344).
### Source Data
#### Initial Data Collection and Normalization
Data was downloaded via the archive.org API. No data inference was done.
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
No manual annotation is done. We download only source audio with already existing transcripts.
#### Who are the annotators?
For the test and dev sets, we paid native American English speakers to do transcriptions. We do not know the identities of the transcriptionists for data in the training set. For the training set, we have noticed that some transcriptions are likely to be the output of automatic speech recognition systems.
### Personal and Sensitive Information
Several of our sources are legal and government proceedings, spoken histories, speeches, and so on. Given that these were intended as public documents and licensed as such, it is natural that the involved individuals are aware of this.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset could be used for speech synthesis. However, this requires careful cleaning of the dataset, as background noise is not tolerable for speech synthesis.
The dataset could be used for keyword spotting tasks as well. In particular, this is a good use case for the non-English audio in the dataset.
Our sincere hope is that the large breadth of sources our dataset incorporates reduces existing quality-of-service issues, such as speech recognition systems' poor understanding of non-native English accents. We cannot think of any unfair treatment that could come from using this dataset at this time.
### Discussion of Biases
Our data is downloaded from archive.org. As such, the data is biased towards whatever users decide to upload there.
Almost all of our data is American accented English.
### Other Known Limitations
As of version 1.0, a portion of data in the training, test, and dev sets is poorly aligned. Specifically, some words appear in the transcript but not the audio, and some words appear in the audio but not the transcript. We are working on fixing this.
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
We provide CC-BY and CC-BY-SA subsets of the dataset.
### Citation Information
Please cite:
```
@article{DBLP:journals/corr/abs-2111-09344,
author = {Daniel Galvez and
Greg Diamos and
Juan Ciro and
Juan Felipe Cer{\'{o}}n and
Keith Achorn and
Anjali Gopi and
David Kanter and
Maximilian Lam and
Mark Mazumder and
Vijay Janapa Reddi},
title = {The People's Speech: {A} Large-Scale Diverse English Speech Recognition
Dataset for Commercial Usage},
journal = {CoRR},
volume = {abs/2111.09344},
year = {2021},
url = {https://arxiv.org/abs/2111.09344},
eprinttype = {arXiv},
eprint = {2111.09344},
timestamp = {Mon, 22 Nov 2021 16:44:07 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2111-09344.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
dim/HC3_ru | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: human_answers
sequence: string
- name: chatgpt_answers
sequence: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 135406074
num_examples: 24322
download_size: 62378894
dataset_size: 135406074
---
# Dataset Card for "HC3_ru"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ToastyPigeon__smolphin-test-bottomheavy | ---
pretty_name: Evaluation run of ToastyPigeon/smolphin-test-bottomheavy
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ToastyPigeon/smolphin-test-bottomheavy](https://huggingface.co/ToastyPigeon/smolphin-test-bottomheavy)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ToastyPigeon__smolphin-test-bottomheavy\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:21:21.801728](https://huggingface.co/datasets/open-llm-leaderboard/details_ToastyPigeon__smolphin-test-bottomheavy/blob/main/results_2024-03-29T21-21-21.801728.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26379372985202537,\n\
\ \"acc_stderr\": 0.030950618926962145,\n \"acc_norm\": 0.2651864113360157,\n\
\ \"acc_norm_stderr\": 0.03172234413160602,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520667,\n \"mc2\": 0.38487585909607447,\n\
\ \"mc2_stderr\": 0.014094144877528773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2901023890784983,\n \"acc_stderr\": 0.013261573677520764,\n\
\ \"acc_norm\": 0.3267918088737201,\n \"acc_norm_stderr\": 0.013706665975587333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44293965345548697,\n\
\ \"acc_stderr\": 0.004957182635381807,\n \"acc_norm\": 0.591714797849034,\n\
\ \"acc_norm_stderr\": 0.004905119039849451\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677084,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677084\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108614,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108614\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628824,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628824\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2967741935483871,\n\
\ \"acc_stderr\": 0.0259885007924119,\n \"acc_norm\": 0.2967741935483871,\n\
\ \"acc_norm_stderr\": 0.0259885007924119\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114475,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114475\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.031922715695482995,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.031922715695482995\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3434343434343434,\n \"acc_stderr\": 0.03383201223244444,\n \"\
acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.03383201223244444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28717948717948716,\n \"acc_stderr\": 0.022939925418530623,\n\
\ \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.022939925418530623\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844072,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.30091743119266057,\n \"acc_stderr\": 0.019664751366802114,\n \"\
acc_norm\": 0.30091743119266057,\n \"acc_norm_stderr\": 0.019664751366802114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n\
\ \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n\
\ \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658346,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.038498560987940876,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.038498560987940876\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573971,\n\
\ \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573971\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n\
\ \"acc_stderr\": 0.027046857630716677,\n \"acc_norm\": 0.21794871794871795,\n\
\ \"acc_norm_stderr\": 0.027046857630716677\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2567049808429119,\n\
\ \"acc_stderr\": 0.015620480263064541,\n \"acc_norm\": 0.2567049808429119,\n\
\ \"acc_norm_stderr\": 0.015620480263064541\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02526169121972949,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02526169121972949\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.022827317491059682,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.022827317491059682\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827073,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827073\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.0276784686421447,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.0276784686421447\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22448979591836735,\n\
\ \"acc_stderr\": 0.02671143055553842,\n \"acc_norm\": 0.22448979591836735,\n\
\ \"acc_norm_stderr\": 0.02671143055553842\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2469879518072289,\n \"acc_stderr\": 0.03357351982064537,\n\
\ \"acc_norm\": 0.2469879518072289,\n \"acc_norm_stderr\": 0.03357351982064537\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n\
\ \"acc_stderr\": 0.0340105262010409,\n \"acc_norm\": 0.26900584795321636,\n\
\ \"acc_norm_stderr\": 0.0340105262010409\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520667,\n\
\ \"mc2\": 0.38487585909607447,\n \"mc2_stderr\": 0.014094144877528773\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.6179952644041041,\n\
\ \"acc_stderr\": 0.013655578215970406\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.01288855193328279,\n \"acc_stderr\": 0.003106901266499669\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ToastyPigeon/smolphin-test-bottomheavy
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-21-21.801728.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-21-21.801728.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- '**/details_harness|winogrande|5_2024-03-29T21-21-21.801728.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-21-21.801728.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_21_21.801728
path:
- results_2024-03-29T21-21-21.801728.parquet
- split: latest
path:
- results_2024-03-29T21-21-21.801728.parquet
---
# Dataset Card for Evaluation run of ToastyPigeon/smolphin-test-bottomheavy
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ToastyPigeon/smolphin-test-bottomheavy](https://huggingface.co/ToastyPigeon/smolphin-test-bottomheavy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ToastyPigeon__smolphin-test-bottomheavy",
"harness_winogrande_5",
split="train")
```
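The per-run split names listed in the configurations above (e.g. `2024_03_29T21_21_21.801728`) appear to be the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that naming convention, assuming the pattern seen in the config list holds in general:

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp such as '2024-03-29T21:21:21.801728' to the
    split name used in this dataset's configurations (assumed convention)."""
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_split_name("2024-03-29T21:21:21.801728"))
# 2024_03_29T21_21_21.801728
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run instead of the `latest` alias.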
## Latest results
These are the [latest results from run 2024-03-29T21:21:21.801728](https://huggingface.co/datasets/open-llm-leaderboard/details_ToastyPigeon__smolphin-test-bottomheavy/blob/main/results_2024-03-29T21-21-21.801728.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26379372985202537,
"acc_stderr": 0.030950618926962145,
"acc_norm": 0.2651864113360157,
"acc_norm_stderr": 0.03172234413160602,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520667,
"mc2": 0.38487585909607447,
"mc2_stderr": 0.014094144877528773
},
"harness|arc:challenge|25": {
"acc": 0.2901023890784983,
"acc_stderr": 0.013261573677520764,
"acc_norm": 0.3267918088737201,
"acc_norm_stderr": 0.013706665975587333
},
"harness|hellaswag|10": {
"acc": 0.44293965345548697,
"acc_stderr": 0.004957182635381807,
"acc_norm": 0.591714797849034,
"acc_norm_stderr": 0.004905119039849451
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677084,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677084
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108614,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108614
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628824,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628824
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.0259885007924119,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.0259885007924119
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114475,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114475
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.031922715695482995,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.031922715695482995
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3434343434343434,
"acc_stderr": 0.03383201223244444,
"acc_norm": 0.3434343434343434,
"acc_norm_stderr": 0.03383201223244444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28717948717948716,
"acc_stderr": 0.022939925418530623,
"acc_norm": 0.28717948717948716,
"acc_norm_stderr": 0.022939925418530623
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844072,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30091743119266057,
"acc_stderr": 0.019664751366802114,
"acc_norm": 0.30091743119266057,
"acc_norm_stderr": 0.019664751366802114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658346,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573971,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573971
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.027046857630716677,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.027046857630716677
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2567049808429119,
"acc_stderr": 0.015620480263064541,
"acc_norm": 0.2567049808429119,
"acc_norm_stderr": 0.015620480263064541
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.02526169121972949,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.02526169121972949
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.022827317491059682,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.022827317491059682
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827073,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827073
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.0276784686421447,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.0276784686421447
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.02671143055553842,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.02671143055553842
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520667,
"mc2": 0.38487585909607447,
"mc2_stderr": 0.014094144877528773
},
"harness|winogrande|5": {
"acc": 0.6179952644041041,
"acc_stderr": 0.013655578215970406
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499669
}
}
```
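Most per-task accuracies above sit near the 25% four-choice random baseline (overall `acc` ≈ 0.264). A small sketch, using values copied from the JSON above, of how one might flag a task whose score is separated from chance by more than roughly two standard errors (the 1.96 z-threshold is an assumption for illustration, not part of the harness):

```python
def above_chance(acc: float, stderr: float,
                 baseline: float = 0.25, z: float = 1.96) -> bool:
    # True if the score exceeds the random-guess baseline by more than
    # z standard errors (a rough check, ignoring multiple comparisons).
    return acc - z * stderr > baseline

# harness|hendrycksTest-high_school_statistics|5 from the results above
print(above_chance(0.4722222222222222, 0.0340470532865388))  # True
# harness|hendrycksTest-abstract_algebra|5
print(above_chance(0.25, 0.04351941398892446))  # False
```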
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
YaduvanshiAnkit/SanskritShloka3 | ---
license: mit
---
|
rickRossie/dataV1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: messages
dtype: string
- name: characterName
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2967685851
num_examples: 58249
download_size: 1553089676
dataset_size: 2967685851
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pccl-org/formal-logic-simple-order-token-objects-paired-relationship-0-2000 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 184969080
num_examples: 873250
download_size: 49354220
dataset_size: 184969080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
houck2040/videos | ---
license: mit
---
Videos have been created for documentation and testing purposes.
Some of them may be helpful to someone somewhere.
This message was not generated by AI.
|
maghwa/OpenHermes-2-AR-10K-30-720k-730k | ---
dataset_info:
features:
- name: source
dtype: string
- name: hash
dtype: 'null'
- name: category
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: model_name
dtype: 'null'
- name: language
dtype: 'null'
- name: views
dtype: float64
- name: conversations
dtype: string
- name: topic
dtype: 'null'
- name: id
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: idx
dtype: 'null'
- name: title
dtype: 'null'
- name: model
dtype: 'null'
splits:
- name: train
num_bytes: 25337285
num_examples: 10001
download_size: 11474223
dataset_size: 25337285
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bbokyeong/koalpaca-final | ---
license: unknown
---
|
Kesav22/biddinghub | ---
task_categories:
- conversational
- question-answering
tags:
- not-for-all-audiences
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_nlpguy__ColorShadow-7B-v2 | ---
pretty_name: Evaluation run of nlpguy/ColorShadow-7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/ColorShadow-7B-v2](https://huggingface.co/nlpguy/ColorShadow-7B-v2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__ColorShadow-7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T17:39:38.883632](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__ColorShadow-7B-v2/blob/main/results_2023-12-30T17-39-38.883632.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6052838646829277,\n\
\ \"acc_stderr\": 0.03314929477593008,\n \"acc_norm\": 0.60872976450195,\n\
\ \"acc_norm_stderr\": 0.033818410657125174,\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6292738740677482,\n\
\ \"mc2_stderr\": 0.014921407429654188\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042198,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.632742481577375,\n\
\ \"acc_stderr\": 0.004810723108378217,\n \"acc_norm\": 0.8469428400716988,\n\
\ \"acc_norm_stderr\": 0.003593067633474304\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.02698528957655273,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.02698528957655273\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630641,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630641\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518721,\n \
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518721\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612903,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612903\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082394,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082394\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489288,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489288\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30726256983240224,\n\
\ \"acc_stderr\": 0.015430158846469606,\n \"acc_norm\": 0.30726256983240224,\n\
\ \"acc_norm_stderr\": 0.015430158846469606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297247,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297247\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065684,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065684\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6292738740677482,\n\
\ \"mc2_stderr\": 0.014921407429654188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47308567096285065,\n \
\ \"acc_stderr\": 0.013752517189717466\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/ColorShadow-7B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|arc:challenge|25_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|gsm8k|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hellaswag|10_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-39-38.883632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T17-39-38.883632.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- '**/details_harness|winogrande|5_2023-12-30T17-39-38.883632.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T17-39-38.883632.parquet'
- config_name: results
data_files:
- split: 2023_12_30T17_39_38.883632
path:
- results_2023-12-30T17-39-38.883632.parquet
- split: latest
path:
- results_2023-12-30T17-39-38.883632.parquet
---
# Dataset Card for Evaluation run of nlpguy/ColorShadow-7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/ColorShadow-7B-v2](https://huggingface.co/nlpguy/ColorShadow-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__ColorShadow-7B-v2",
"harness_winogrande_5",
split="train")
```
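Each run adds a timestamped split alongside the `latest` alias. If you ever need to pick the newest run yourself (for example, when comparing successive evals), the zero-padded timestamp names sort lexicographically in chronological order. A small helper, purely illustrative and not part of any official API:

```python
# Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff, plus a "latest" alias.
# Because every field is zero-padded, lexicographic order matches chronological order.

def newest_run(split_names):
    """Return the most recent timestamped split name, ignoring the 'latest' alias."""
    runs = [s for s in split_names if s != "latest"]
    if not runs:
        raise ValueError("no timestamped splits found")
    return max(runs)

splits = ["2023_12_30T17_39_38.883632", "latest"]
print(newest_run(splits))  # 2023_12_30T17_39_38.883632
```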
## Latest results
These are the [latest results from run 2023-12-30T17:39:38.883632](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__ColorShadow-7B-v2/blob/main/results_2023-12-30T17-39-38.883632.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6052838646829277,
"acc_stderr": 0.03314929477593008,
"acc_norm": 0.60872976450195,
"acc_norm_stderr": 0.033818410657125174,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6292738740677482,
"mc2_stderr": 0.014921407429654188
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042198,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537298
},
"harness|hellaswag|10": {
"acc": 0.632742481577375,
"acc_stderr": 0.004810723108378217,
"acc_norm": 0.8469428400716988,
"acc_norm_stderr": 0.003593067633474304
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655273,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655273
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630641,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630641
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518721,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518721
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612903,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082394,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082394
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489288,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489288
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30726256983240224,
"acc_stderr": 0.015430158846469606,
"acc_norm": 0.30726256983240224,
"acc_norm_stderr": 0.015430158846469606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297247,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297247
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.01268781841959992,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.01268781841959992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065684,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065684
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6292738740677482,
"mc2_stderr": 0.014921407429654188
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.47308567096285065,
"acc_stderr": 0.013752517189717466
}
}
```
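The `"all"` block is an aggregate over the per-task scores. As a sketch of how such an average can be reproduced from the JSON above — this is an unweighted mean over a few tasks for illustration; the leaderboard's exact grouping and weighting may differ:

```python
# A few per-task acc values copied from the results above.
per_task_acc = {
    "hendrycksTest-abstract_algebra": 0.27,
    "hendrycksTest-anatomy": 0.5481481481481482,
    "hendrycksTest-astronomy": 0.6973684210526315,
}

# Unweighted mean over the selected tasks; the leaderboard aggregates all tasks.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 4))  # 0.5052
```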
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
jxtse/EE-Eval
---
language:
- en
---
# EE-Eval: A Radio Engineering Domain-Specific Dataset for Language Model Assessment
## Description
The **EE-Eval** dataset is a comprehensive collection of over 200 questions and answers tailored to the domain of radio engineering. It is meticulously curated to encompass a broad spectrum of topics within the field, including but not limited to signal processing, communication protocols, and wireless network architecture. This dataset is designed to serve as a benchmark for evaluating the capabilities of large language models (LLMs) in understanding and generating responses relevant to the intricacies of radio engineering.
The dataset is divided into two main categories: conceptual questions and computational problems. Conceptual questions assess the model's grasp of theoretical knowledge and principles in radio engineering, while computational problems test its ability to perform mathematical operations and logical reasoning specific to the domain.
The **EE-Eval** dataset is a valuable resource for researchers and developers looking to fine-tune, evaluate, and compare the performance of various language models in the context of radio engineering. It provides a standardized platform for assessing the accuracy and applicability of models when dealing with specialized technical content.
## Key Features
- **Domain-Specific Content**: The dataset is focused on radio engineering, covering a wide range of topics that are essential for professionals and enthusiasts in the field.
- **Balanced Question Types**: A mix of conceptual and computational questions ensures a well-rounded assessment of language models' capabilities.
- **Sufficient Dataset Size**: With over 200 questions and answers, the dataset offers ample material for evaluation and comparison purposes.
- **Versatility**: The dataset can be used to evaluate both the comprehension and computational abilities of language models, providing a comprehensive analysis of their performance in the radio engineering domain.
## Usage
The **EE-Eval** dataset can be utilized in various research and development activities, such as:
- Fine-tuning pre-trained language models to improve their performance in radio engineering tasks.
- Benchmarking different models against each other to identify strengths and weaknesses in handling specialized technical content.
- Developing and testing new algorithms and techniques aimed at enhancing the domain-specific understanding of language models.
The dataset is hosted on the Hugging Face platform, making it easily accessible for the research community and ensuring compatibility with Hugging Face's suite of tools and frameworks.
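Once a model's answers have been collected, one plausible way to score them against the dataset's reference answers is normalized exact match. The sketch below is illustrative only: the scoring rule and the toy question-answer pairs are assumptions, not part of the published EE-Eval schema or evaluation protocol.

```python
# Hypothetical scoring sketch for EE-Eval-style Q&A evaluation.
# The normalization rule and sample data are assumptions for illustration.

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial formatting differences don't count as errors."""
    return " ".join(text.lower().split())

def exact_match_accuracy(predictions: list[str], references: list[str]) -> float:
    """Fraction of predictions that exactly match their reference after normalization."""
    if not references:
        return 0.0
    hits = sum(normalize(p) == normalize(r) for p, r in zip(predictions, references))
    return hits / len(references)

# Toy radio-engineering Q&A pairs, for illustration only.
refs = ["the nyquist rate is twice the highest frequency", "50 ohms"]
preds = ["The Nyquist rate is twice  the highest frequency", "75 ohms"]
print(exact_match_accuracy(preds, refs))  # 0.5
```

Exact match suits short factual answers; computational problems with multi-step derivations would likely need a more tolerant metric.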
## Contribution
We welcome contributions from the community to enrich the **EE-Eval** dataset further. Whether it's by adding more questions, providing diverse answers, or improving the dataset's documentation, every contribution helps in enhancing the resource's value and impact.
Please note that the **EE-Eval** dataset is provided for non-commercial research and educational purposes. Any commercial use of the dataset should be authorized by the creators and contributors.
We encourage researchers and developers to leverage the **EE-Eval** dataset to advance the state of the art in language models' domain-specific capabilities and contribute to the growth of the radio engineering field. |
laleye/aloresb | ---
task_categories:
- automatic-speech-recognition
tags:
- multilingual
- low resource
pretty_name: alorsb
--- |
julia-lukasiewicz-pater/GPT-wiki-intro-features | ---
license: cc
task_categories:
- text-classification
language:
- en
size_categories:
- 100K<n<1M
---
# Small-GPT-wiki-intro-features dataset
This dataset is based on [aadityaubhat/GPT-wiki-intro](https://huggingface.co/datasets/aadityaubhat/GPT-wiki-intro).
It contains 150k short texts from Wikipedia (label 0) and corresponding texts generated by ChatGPT (label 1) (together 300k texts).
For each text, various complexity measures were calculated, including readability and lexical diversity metrics.
It can be used for text classification or analysis of linguistic features of human-generated and ChatGPT-generated texts.
For a smaller version, check out [julia-lukasiewicz-pater/small-GPT-wiki-intro-features](https://huggingface.co/datasets/julia-lukasiewicz-pater/small-GPT-wiki-intro-features).
## Dataset structure
Features were calculated using several Python libraries: NLTK, [readability-metrics](https://pypi.org/project/py-readability-metrics/), [lexical-diversity](https://pypi.org/project/lexical-diversity/),
and [TextDescriptives](https://hlasse.github.io/TextDescriptives/). The list of all features and their corresponding sources can be found below:
| Column | Description |
| ------ | ----------- |
| text | human- or ChatGPT-generated text; taken from aadityaubhat/GPT-wiki-intro |
| normalized_bigram_entropy | bigram entropy normalized with estimated maximum entropy; nltk |
| mean_word_length | mean word length; nltk |
| mean_sent_length | mean sentence length; nltk |
| fog | Gunning-Fog; readability-metrics |
| ari | Automated Readability Index; readability-metrics |
| dale_chall | Dale Chall Readability; readability-metrics |
| hdd | Hypergeometric Distribution; lexical-diversity |
| mtld | Measure of lexical textual diversity; lexical-diversity |
| mattr | Moving average type-token ratio; lexical-diversity |
| number_of_ADJ | proportion of adjectives per word; nltk |
| number_of_ADP | proportion of adpositions per word; nltk |
| number_of_ADV | proportion of adverbs per word; nltk |
| number_of_CONJ | proportion of conjunctions per word; nltk |
| number_of_DET | proportion of determiners per word; nltk |
| number_of_NOUN | proportion of nouns per word; nltk |
| number_of_NUM | proportion of numerals per word; nltk |
| number_of_PRT | proportion of particles per word; nltk |
| number_of_PRON | proportion of pronouns per word; nltk |
| number_of_VERB | proportion of verbs per word; nltk |
| number_of_DOT | proportion of punctuation marks per word; nltk |
| number_of_X | proportion of POS tag 'Other' per word; nltk |
| class | binary class, 0 stands for Wikipedia, 1 stands for ChatGPT |
| spacy_perplexity | text perplexity; TextDescriptives |
| entropy | text entropy; TextDescriptives |
| automated_readability_index | Automated Readability Index; TextDescriptives |
| per_word_spacy_perplexity | text perplexity per word; TextDescriptives |
| dependency_distance_mean | mean distance from each token to their dependent; TextDescriptives |
| dependency_distance_std | standard deviation of distance from each token to their dependent; TextDescriptives |
| first_order_coherence | cosine similarity between consecutive sentences; TextDescriptives |
| second_order_coherence | cosine similarity between sentences that are two sentences apart; TextDescriptives |
| smog | SMOG; TextDescriptives |
| prop_adjacent_dependency_relation_mean | mean proportion adjacent dependency relations; TextDescriptives |
| prop_adjacent_dependency_relation_std | standard deviation of proportion adjacent dependency relations; TextDescriptives |
| syllables_per_token_mean | mean of syllables per token; TextDescriptives |
| syllables_per_token_median | median of syllables per token; TextDescriptives |
| token_length_std | standard deviation of token length; TextDescriptives |
| token_length_median | median of token length; TextDescriptives |
| sentence_length_median | median of sentence length; TextDescriptives |
| syllables_per_token_std | standard deviation of syllables per token; TextDescriptives |
| proportion_unique_tokens | proportion of unique tokens; TextDescriptives |
| top_ngram_chr_fraction_3 | fraction of characters in a document contained within the top trigrams; TextDescriptives |
| top_ngram_chr_fraction_2 | fraction of characters in a document contained within the top bigrams; TextDescriptives |
| top_ngram_chr_fraction_4 | fraction of characters in a document contained within the top 4-grams; TextDescriptives |
| proportion_bullet_points | proportion of lines that start with a bullet point; TextDescriptives |
| flesch_reading_ease | Flesch Reading ease ; TextDescriptives |
| flesch_kincaid_grade | Flesch Kincaid grade; TextDescriptives |
| gunning_fog | Gunning-Fog; TextDescriptives |
| coleman_liau_index | Coleman-Liau Index; TextDescriptives |
| oov_ratio| out-of-vocabulary ratio; TextDescriptives |
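Two of the simpler features in the table above can be sketched in plain Python. Note this is an illustrative re-implementation, not the NLTK/TextDescriptives pipeline actually used to build the dataset, so values may differ slightly from the released columns (e.g. due to tokenization differences).

```python
# Illustrative sketch of two simple features from the table above.
# Whitespace tokenization is an assumption; the dataset used NLTK tokenizers.

def mean_word_length(text: str) -> float:
    """Average number of characters per whitespace-separated word."""
    words = text.split()
    return sum(len(w) for w in words) / len(words) if words else 0.0

def proportion_unique_tokens(text: str) -> float:
    """Share of distinct tokens among all tokens (case-insensitive)."""
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens) if tokens else 0.0

sample = "the cat sat on the mat"
print(mean_word_length(sample))          # 17/6 ≈ 2.833
print(proportion_unique_tokens(sample))  # 5 unique of 6 tokens ≈ 0.833
```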
## Code
Code that was used to generate this dataset can be found on [Github](https://github.com/julia-lukasiewicz-pater/gpt-wiki-features/tree/main). |
tyzhu/find_last_sent_train_100_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 266305
num_examples: 210
- name: validation
num_bytes: 10271
num_examples: 10
download_size: 136034
dataset_size: 276576
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_100_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umaru97/flickr30k_train_val_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: caption
list: string
- name: sentids
list: string
- name: split
dtype: string
- name: img_id
dtype: string
- name: filename
dtype: string
splits:
- name: train
num_bytes: 3817535945.6791124
num_examples: 29000
- name: val
num_bytes: 140547184.20822826
num_examples: 1014
- name: test
num_bytes: 142117238.54065907
num_examples: 1000
download_size: 4305964964
dataset_size: 4100200368.4279995
---
# Dataset Card for "flickr30k_train_val_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-ARTeLab__ilpost-ARTeLab__ilpost-d2ea00-1904764775 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ARTeLab/ilpost
eval_info:
task: summarization
model: ARTeLab/it5-summarization-ilpost
metrics: ['bertscore']
dataset_name: ARTeLab/ilpost
dataset_config: ARTeLab--ilpost
dataset_split: test
col_mapping:
text: source
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-ilpost
* Dataset: ARTeLab/ilpost
* Config: ARTeLab--ilpost
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@morenolq](https://huggingface.co/morenolq) for evaluating this model. |
mairicreedon/288-fire-severity-demo | ---
license: pddl
---
|
Ruanitto/NengueCunamata | ---
license: openrail
---
|
krishnareddy/icddxdescmap | ---
license: apache-2.0
language:
- en
tags:
- CAC
- ICD10
pretty_name: ICD10 DX Code Description
---
# ICD10 Diagnosis Description Mapping Dataset
## Overview
This dataset is designed to assist in mapping ICD10 Diagnosis descriptions documented in clinical documents to the standard ICD10 Diagnosis descriptions by CMS (Centers for Medicare & Medicaid Services). The primary objective is to train a model that can map free-form disease text to ICD Codes.
## Dataset Details
The dataset consists of the following columns:
- **AnnotationString**: This column contains the disease text as described in patient clinical documents.
- **DXCode**: This column contains the standard DX code corresponding to the description in the AnnotationString column.
- **ShortDesc**: This column contains the standard ICD Description in a short form corresponding to the DXCode.
- **LongDesc**: This column contains the standard ICD Description in a long form corresponding to the DXCode.
## Model Training Objective
The goal of this dataset is to train the Llama 2 7B model to generate one of the following outputs from the provided AnnotationString input:
### Case 1: Expecting DX Code as Output
- **Instruction**: As an ICD Code helper, the model needs to generate an ICD 10 DX Code for the corresponding disease description.
- **Input**: `<Annotation String>`
- **Output**: `<DXCode>`
### Case 2: Expecting ShortDesc as Output
- **Instruction**: As an ICD Code helper, the model needs to generate the standard ICD 10 DX Code Short Description for the corresponding clinical text.
- **Input**: `<Annotation String>`
- **Output**: `<ShortDesc>`
### Case 3: Expecting LongDesc as Output
- **Instruction**: As an ICD Code helper, the model needs to generate the standard ICD 10 DX Code Long Description for the corresponding clinical text.
- **Input**: `<Annotation String>`
- **Output**: `<LongDesc>`
## Usage
To utilize this dataset effectively for training the LLAMA2 7B Model, you can follow these steps:
1. Preprocess your input data to match the format specified in the "Instruction" for the desired case (1, 2, or 3).
2. Use the preprocessed data as input to your model.
3. The model will generate the corresponding output (DXCode, ShortDesc, or LongDesc) based on the case you specified.
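The three cases above share one prompt shape: a per-case instruction followed by the AnnotationString. A minimal sketch of step 1 (preprocessing inputs into that format) is below; the exact instruction wording and the `### Instruction:`/`### Input:`/`### Response:` layout are assumptions based on common Llama 2 fine-tuning templates, not a format confirmed by the dataset authors.

```python
# Hypothetical prompt-assembly sketch for the three cases described above.
# Instruction strings and template layout are assumptions, not the authors' exact format.

INSTRUCTIONS = {
    "dx_code": "As an ICD Code helper, generate the ICD-10 DX code for the following disease description.",
    "short_desc": "As an ICD Code helper, generate the standard ICD-10 DX code short description for the following clinical text.",
    "long_desc": "As an ICD Code helper, generate the standard ICD-10 DX code long description for the following clinical text.",
}

def build_prompt(annotation_string: str, case: str) -> str:
    """Combine the per-case instruction with the AnnotationString input."""
    return (
        f"### Instruction:\n{INSTRUCTIONS[case]}\n\n"
        f"### Input:\n{annotation_string}\n\n"
        f"### Response:\n"
    )

prompt = build_prompt("Patient presents with persistent cough and fever.", "dx_code")
print(prompt.splitlines()[0])  # ### Instruction:
```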
## Example
Here's an example of how to use this dataset and the model:
### Input
```python
# Instruction: As an ICD Code helper, generate the ICD-10 DX code for the following disease description
# Input: "Patient presents with persistent cough and fever."
```
### Output (Case 1)
```python
# Output: "J44.9"
```
### Output (Case 2)
```python
# Output: "Chronic obstructive pulmonary disease, unspecified"
```
### Output (Case 3)
```python
# Output: "Chronic obstructive pulmonary disease, unspecified"
```
**Warning: Potential Mapping Errors**
This dataset was derived from a rule-based NLP application, so errors are possible in the mapping of clinical text to standard codes or descriptions. While every effort has been made to ensure accuracy, automated mapping is inherently challenging, especially with complex medical terminology and variations in clinical documentation.
Users should exercise caution when relying on the generated results and consider the following:
1. **Clinical Variation:** Clinical documentation can vary significantly in style and terminology, which may lead to inaccuracies in mapping.
2. **Complex Cases:** Some medical conditions or descriptions may not have a one-to-one mapping to standard codes or descriptions, resulting in potential discrepancies.
3. **Data Quality:** The accuracy of the mapping is also dependent on the quality and consistency of the input clinical text.
4. **Continuous Improvement:** Efforts are ongoing to improve the accuracy of the mapping, and users are encouraged to provide feedback and report any discrepancies.
It is advisable to cross-verify the results generated by this dataset, especially in critical healthcare applications where accuracy is paramount.
Please use this dataset responsibly and in conjunction with clinical expertise to make informed decisions.
If you encounter any issues or have suggestions for improvement, please reach out to the dataset maintainers.
Thank you for your understanding and cooperation.
## Citation
If you use this dataset or the pre-trained model in your research or applications, please consider citing the source or authors for proper attribution.
---
Feel free to modify and expand this readme file to include any additional information or instructions specific to your use case. |
snowfly/processed_demo | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 50
num_examples: 2
- name: validation
num_bytes: 50
num_examples: 2
- name: test
num_bytes: 50
num_examples: 2
download_size: 3987
dataset_size: 150
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
huggingface/autotrain-data-julsimon-sentiment-analysis-movies |
zolak/twitter_dataset_79_1713228553 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 193294
num_examples: 485
download_size: 103462
dataset_size: 193294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|