| datasetId | card |
|---|---|
Bengt0/Toads_and_Frogs_Datasets_Anuran | ---
license: cc0-1.0
pretty_name: Toads and Frogs Datasets (Anuran)
size_categories:
- 1K<n<10K
---
Source: https://www.kaggle.com/datasets/erhanakbal/toads-and-frogs-datasets-anuran |
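The leaderboard-details card in the next row names its splits and parquet files after the run timestamp (e.g. run `2024-02-22T15:19:16.930430` appears as the split `2024_02_22T15_19_16.930430` and in filenames like `details_harness|gsm8k|5_2024-02-22T15-19-16.930430.parquet`). A minimal sketch of that mapping, inferred from the names visible in the card; `split_name` and `parquet_name` are hypothetical helpers, not part of any library:

```python
def split_name(run_timestamp: str) -> str:
    """Derive the split name used in `data_files` from a run timestamp.

    Inferred convention: hyphens in the date and colons in the time
    become underscores; the fractional-seconds dot is kept.
    """
    date, time = run_timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")


def parquet_name(task: str, num_fewshot: int, run_timestamp: str) -> str:
    """Derive the per-task parquet filename for the same run.

    Inferred convention: colons in the timestamp become hyphens.
    """
    return f"details_{task}|{num_fewshot}_{run_timestamp.replace(':', '-')}.parquet"


ts = "2024-02-22T15:19:16.930430"
print(split_name(ts))                        # 2024_02_22T15_19_16.930430
print(parquet_name("harness|gsm8k", 5, ts))  # details_harness|gsm8k|5_2024-02-22T15-19-16.930430.parquet
```

This is only a reading aid for the `configs` section below; to actually load a run's details, use the `load_dataset` call the card itself gives.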
open-llm-leaderboard/details_Yuma42__KangalKhan-PrimordialSapphire-7B | ---
pretty_name: Evaluation run of Yuma42/KangalKhan-PrimordialSapphire-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yuma42/KangalKhan-PrimordialSapphire-7B](https://huggingface.co/Yuma42/KangalKhan-PrimordialSapphire-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-PrimordialSapphire-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T15:19:16.930430](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-PrimordialSapphire-7B/blob/main/results_2024-02-22T15-19-16.930430.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6338756874368878,\n\
\ \"acc_stderr\": 0.032192402187293655,\n \"acc_norm\": 0.6353309372700157,\n\
\ \"acc_norm_stderr\": 0.032835260851068776,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5725359946395934,\n\
\ \"mc2_stderr\": 0.015463705718675662\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.628839590443686,\n \"acc_stderr\": 0.01411797190114282,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6728739294961164,\n\
\ \"acc_stderr\": 0.0046820489066223174,\n \"acc_norm\": 0.8551085441147181,\n\
\ \"acc_norm_stderr\": 0.003512719952354537\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469546,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469546\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.02354079935872328,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.02354079935872328\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n\
\ \"acc_stderr\": 0.015694238967737383,\n \"acc_norm\": 0.32737430167597764,\n\
\ \"acc_norm_stderr\": 0.015694238967737383\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729474,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729474\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406762,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406762\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960227,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960227\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5725359946395934,\n\
\ \"mc2_stderr\": 0.015463705718675662\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6118271417740713,\n \
\ \"acc_stderr\": 0.013423607564002757\n }\n}\n```"
repo_url: https://huggingface.co/Yuma42/KangalKhan-PrimordialSapphire-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|arc:challenge|25_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|gsm8k|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hellaswag|10_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-19-16.930430.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T15-19-16.930430.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- '**/details_harness|winogrande|5_2024-02-22T15-19-16.930430.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T15-19-16.930430.parquet'
- config_name: results
data_files:
- split: 2024_02_22T15_19_16.930430
path:
- results_2024-02-22T15-19-16.930430.parquet
- split: latest
path:
- results_2024-02-22T15-19-16.930430.parquet
---
# Dataset Card for Evaluation run of Yuma42/KangalKhan-PrimordialSapphire-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-PrimordialSapphire-7B](https://huggingface.co/Yuma42/KangalKhan-PrimordialSapphire-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-PrimordialSapphire-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-22T15:19:16.930430](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-PrimordialSapphire-7B/blob/main/results_2024-02-22T15-19-16.930430.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6338756874368878,
"acc_stderr": 0.032192402187293655,
"acc_norm": 0.6353309372700157,
"acc_norm_stderr": 0.032835260851068776,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5725359946395934,
"mc2_stderr": 0.015463705718675662
},
"harness|arc:challenge|25": {
"acc": 0.628839590443686,
"acc_stderr": 0.01411797190114282,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497728
},
"harness|hellaswag|10": {
"acc": 0.6728739294961164,
"acc_stderr": 0.0046820489066223174,
"acc_norm": 0.8551085441147181,
"acc_norm_stderr": 0.003512719952354537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469546,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469546
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872328,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872328
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737383,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729474,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729474
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135118
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406762,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406762
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960227,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960227
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5725359946395934,
"mc2_stderr": 0.015463705718675662
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.6118271417740713,
"acc_stderr": 0.013423607564002757
}
}
```
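The JSON above uses a flat task-to-metrics layout, with an aggregate `"all"` entry alongside the per-task entries. As an illustration (not part of the official leaderboard tooling), per-task accuracies can be pulled out of such a dict like this, using a small excerpt of the results shown above:

```python
# Small excerpt of the results layout shown above (not the full file).
results = {
    "all": {"acc": 0.6338756874368878, "acc_norm": 0.6353309372700157},
    "harness|arc:challenge|25": {"acc": 0.628839590443686},
    "harness|hellaswag|10": {"acc": 0.6728739294961164},
    "harness|winogrande|5": {"acc": 0.7821625887924231},
}

# Per-task accuracies, skipping the aggregate "all" entry.
task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all"
}
print(task_acc)
```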
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
MuskumPillerum/General-Knowledge
---
license: mit
task_categories:
- text-generation
- text2text-generation
language:
- en
tags:
- general knowledge
- GK
- reasoning
- facts
- alpaca
pretty_name: General knowledge dataset
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
### Dataset Summary
The dataset is a collection of questions and answers themed on general facts and reasoning. The dataset is divided into two features: 'Question' and 'Answer'.
It is meant to be used for training a model to be good at general knowledge and reasoning. This dataset is inspired by the Alpaca dataset and in fact contains a subset of the Alpaca dataset.
### Distribution
The distribution of the MuskumPillerum/General-Knowledge dataset is:
```
Total (non alpaca): 6315
- Facts - 80.8 %
- Nature - 16.5 %
- AI, Computer science, Robotics - 7.3 %
- Physics, Chemistry - 16.3 %
- Geography, History - 11.2 %
- People - 16 %
- Sports - 13.5 %
- Recommendation, Reasoning, Dilemma - 17.8 %
- Others - 1.4 %
```
### Format
```
{'Question': 'What is the largest species of shark',
'Answer': 'The whale shark is considered the largest species of shark, with adults reaching lengths of up to 40 feet or more and weighing several tons.'}
```
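For instruction tuning, each record can be rendered into a single training prompt. A minimal sketch — the template wording here is an assumption for illustration, not part of the dataset:

```python
def format_example(record: dict) -> str:
    """Render one Question/Answer record as a training prompt.

    The surrounding template text is illustrative only; adapt it to the
    prompt format your training pipeline expects.
    """
    return (
        "### Question:\n"
        f"{record['Question']}\n\n"
        "### Answer:\n"
        f"{record['Answer']}"
    )

# Example record in the format shown above.
record = {
    "Question": "What is the largest species of shark",
    "Answer": "The whale shark is considered the largest species of shark.",
}
print(format_example(record))
```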
### Languages
English
### Source Data
This dataset is inspired by Stanford's Alpaca dataset: tatsu-lab/alpaca
```
@misc{alpaca,
author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto },
title = {Stanford Alpaca: An Instruction-following LLaMA model},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}},
}
```
### Licensing Information
This dataset is released under the MIT license.
### Citation Information
For now, please cite this dataset as: MuskumPillerum/General-Knowledge
|
FINNUMBER/FINCH_TRAIN_EXT_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 42738064
num_examples: 11059
download_size: 23708381
dataset_size: 42738064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ajaykarthick/imdb-movie-reviews | ---
task_categories:
- text-classification
- token-classification
- feature-extraction
pretty_name: Movie-Reviews
size_categories:
- 10K<n<100K
---
# IMDB Movie Reviews

This is a dataset for binary sentiment classification. It contains 50,000 highly polar movie reviews for training models on text classification tasks.
The dataset is downloaded from
https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz
The data is processed and split into training and test sets with an 80/20 train/test split: the training set contains 40,000 reviews and the test set contains 10,000 reviews.
The labels are equally distributed in both splits: the training set has 20,000 reviews for each of the positive and negative classes, and the test set has 5,000 reviews per class.
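The balanced 80/20 split described above can be reproduced with a simple stratified split. The sketch below is illustrative (the `stratified_split` helper is ours, not part of the dataset's tooling) and assumes labels are given as a flat list:

```python
import random

def stratified_split(labels, test_frac=0.2, seed=0):
    """Return (train_idx, test_idx) with test_frac carved off each class."""
    rng = random.Random(seed)
    by_label = {}
    for i, y in enumerate(labels):
        by_label.setdefault(y, []).append(i)
    train_idx, test_idx = [], []
    for idxs in by_label.values():
        rng.shuffle(idxs)
        n_test = int(len(idxs) * test_frac)
        test_idx.extend(idxs[:n_test])
        train_idx.extend(idxs[n_test:])
    return train_idx, test_idx

# With 50,000 balanced labels this yields 40,000 train / 10,000 test
# indices, with 5,000 test reviews per class.
```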
### Citation Information
```
@InProceedings{maas-EtAl:2011:ACL-HLT2011,
author = {Maas, Andrew L. and Daly, Raymond E. and Pham, Peter T. and Huang, Dan and Ng, Andrew Y. and Potts, Christopher},
title = {Learning Word Vectors for Sentiment Analysis},
booktitle = {Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies},
month = {June},
year = {2011},
address = {Portland, Oregon, USA},
publisher = {Association for Computational Linguistics},
pages = {142--150},
url = {http://www.aclweb.org/anthology/P11-1015}
}
``` |
petrpan26/typescript-jest | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 564108784
num_examples: 11324
download_size: 199094377
dataset_size: 564108784
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-9e17c416-43f7-4fe8-b337-f391ae065c4a-6963 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: autoevaluate/entity-extraction
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: autoevaluate/entity-extraction
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
adamwatters/multijam-avatar | ---
license: openrail
---
|
LangChainDatasets/agent-search-calculator | ---
license: mit
---
|
magicsword/train-en-zh | ---
license: openrail
task_categories:
- translation
language:
- en
- zh
tags:
- not-for-all-audiences
pretty_name: first time train
size_categories:
- 100K<n<1M
--- |
gx-ai-architect/helpsteer_preference | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 364976952
num_examples: 33891
- name: test
num_bytes: 18829968
num_examples: 1727
download_size: 61389517
dataset_size: 383806920
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yuan-sf63/chenyu_label_0.8_16 | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
splits:
- name: train
num_bytes: 6963334.680874696
num_examples: 38893
- name: validation
num_bytes: 773803.3191253038
num_examples: 4322
download_size: 0
dataset_size: 7737138.0
---
# Dataset Card for "chenyu_label_0.8_16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/MultiSpeakerDetection_LibriSpeech-TestClean | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
- name: utterance 1
dtype: string
- name: utterance 2
dtype: string
- name: utterance 3
dtype: string
- name: utterance 4
dtype: string
- name: utterance 5
dtype: string
splits:
- name: test
num_bytes: 71493752.4
num_examples: 200
download_size: 67905455
dataset_size: 71493752.4
---
# Dataset Card for "MultiSpeakerDetection_LibriSpeechTestClean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Henrychur/MMedBench | ---
license: cc-by-4.0
language:
- en
- zh
- ja
- fr
- ru
- es
tags:
- medical
task_categories:
- question-answering
---
# MMedBench
[💻Github Repo](https://github.com/MAGIC-AI4Med/MMedLM) [🖨️arXiv Paper](https://arxiv.org/abs/2402.13963)
The official benchmark for "Towards Building Multilingual Language Model for Medicine".
## Introduction
This repo contains MMedBench, a comprehensive multilingual medical benchmark comprising 45,048 QA pairs for training and 8,518 QA pairs for testing. Each sample includes a question, options, the correct answer, and a reference explanation for the selection of the correct answer.
To access the data, please download MMedBench.zip. Upon extracting the file, you will find two folders named Train and Test. Each folder contains six .jsonl files, each named after its respective language. Each line in these files represents a sample, with the following attributes for each sample:
|Key |Value Type |Description |
|------------------|-------------------|-----------------------------------------|
|question |String | A string of question |
|options |Dict | A dict where the key is the option index ('A', 'B', 'C', 'D', 'E') and the value is the option string|
|answer_idx |String | A string of correct answer indices, separated by ','|
|rationale |String | A string of explanation for the selection of the correct answer |
|human_checked |Bool | Whether the rationale has been manually checked. |
|human_check_passed |Bool | Whether the rationale has passed manual check. |
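As a minimal sketch, each per-language .jsonl file can be loaded line by line (the `load_mmedbench_split` name and the example path are illustrative):

```python
import json

def load_mmedbench_split(path):
    # Each line of the .jsonl file is one sample with the keys
    # described in the table above (question, options, answer_idx, ...).
    samples = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            if line.strip():
                samples.append(json.loads(line))
    return samples

# e.g. english = load_mmedbench_split("MMedBench/Train/English.jsonl")
# answer_idx may list several indices separated by ',':
# correct = english[0]["answer_idx"].split(",")
```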
Our [GitHub](https://github.com/MAGIC-AI4Med/MMedLM) provides the code for fine-tuning on the trainset of MMedBench. Check it out for more details.
## News
[2024.2.21] Our pre-print paper is released on arXiv. Dive into our findings [here](https://arxiv.org/abs/2402.13963).
[2024.2.20] We release [MMedLM](https://huggingface.co/Henrychur/MMedLM) and [MMedLM 2](https://huggingface.co/Henrychur/MMedLM2). With auto-regressive continued training on MMedC, these models achieve superior performance compared to all other open-source models, even rivaling GPT-4 on MMedBench.
[2024.2.20] We release [MMedC](https://huggingface.co/datasets/Henrychur/MMedC), a multilingual medical corpus containing 25.5B tokens.
[2024.2.20] We release [MMedBench](https://huggingface.co/datasets/Henrychur/MMedBench), a new multilingual medical multiple-choice question-answering
benchmark with rationales. Check out the leaderboard [here](https://henrychur.github.io/MultilingualMedQA/).
## Evaluation on MMedBench
The further-pretrained MMedLM 2 showcases strong performance in the medical domain across different languages.
| Method | Size | Year | MMedC | MMedBench | English | Chinese | Japanese | French | Russian | Spanish | Avg. |
|------------------|------|---------|-----------|-----------|----------------|----------------|----------------|----------------|----------------|----------------|----------------|
| GPT-3.5 | - | 2022.12 | ✗ | ✗ | 56.88 | 52.29 | 34.63 | 32.48 | 66.36 | 66.06 | 51.47 |
| GPT-4 | - | 2023.3 | ✗ | ✗ | 78.00 | 75.07 | 72.91 | 56.59 | 83.62 | 85.67 | 74.27 |
| Gemini-1.0 pro | - | 2024.1 | ✗ | ✗ | 53.73 | 60.19 | 44.22 | 29.90 | 73.44 | 69.69 | 55.20 |
| BLOOMZ | 7B | 2023.5 | ✗ | trainset | 43.28 | 58.06 | 32.66 | 26.37 | 62.89 | 47.34 | 45.10 |
| InternLM | 7B | 2023.7 | ✗ | trainset | 44.07 | 64.62 | 37.19 | 24.92 | 58.20 | 44.97 | 45.67 |
| Llama 2 | 7B | 2023.7 | ✗ | trainset | 43.36 | 50.29 | 25.13 | 20.90 | 66.80 | 47.10 | 42.26 |
| MedAlpaca | 7B | 2023.3 | ✗ | trainset | 46.74 | 44.80 | 29.64 | 21.06 | 59.38 | 45.00 | 41.11 |
| ChatDoctor | 7B | 2023.4 | ✗ | trainset | 43.52 | 43.26 | 25.63 | 18.81 | 62.50 | 43.44 | 39.53 |
| PMC-LLaMA | 7B | 2023.4 | ✗ | trainset | 47.53 | 42.44 | 24.12 | 20.74 | 62.11 | 43.29 | 40.04 |
| Mistral | 7B | 2023.10 | ✗ | trainset | 61.74 | 71.10 | 44.72 | 48.71 | 74.22 | 63.86 | 60.73 |
| InternLM\ 2 | 7B | 2024.2 | ✗ | trainset | 57.27 | 77.55 | 47.74 | 41.00 | 68.36 | 59.59 | 58.59 |
| MMedLM (Ours) | 7B | - | ✗ | trainset | 49.88 | 70.49 | 46.23 | 36.66 | 72.27 | 54.52 | 55.01 |
| MMedLM 2 (Ours) | 7B | - | ✗ | trainset | 61.74 | 80.01 | 61.81 | 52.09 | 80.47 | 67.65 | 67.30 |
- GPT and Gemini are evaluated in a zero-shot setting through their APIs.
- Open-source models are first trained on the trainset of MMedBench before evaluation.
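The Avg. column is the unweighted mean over the six languages; as a quick check, recomputing it for MMedLM 2 from the per-language values in the table above:

```python
# Per-language accuracy of MMedLM 2, copied from the table above.
scores = {"English": 61.74, "Chinese": 80.01, "Japanese": 61.81,
          "French": 52.09, "Russian": 80.47, "Spanish": 67.65}
avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # the table reports 67.30
```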
## Contact
If you have any question, please feel free to contact qiupengcheng@pjlab.org.cn.
## Citation
```
@misc{qiu2024building,
title={Towards Building Multilingual Language Model for Medicine},
author={Pengcheng Qiu and Chaoyi Wu and Xiaoman Zhang and Weixiong Lin and Haicheng Wang and Ya Zhang and Yanfeng Wang and Weidi Xie},
year={2024},
eprint={2402.13963},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP | ---
pretty_name: Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP](https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T17:32:35.779900](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP/blob/main/results_2024-01-13T17-32-35.779900.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6659359197324706,\n\
\ \"acc_stderr\": 0.03167249441105516,\n \"acc_norm\": 0.6667077779566729,\n\
\ \"acc_norm_stderr\": 0.032318448519432046,\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7195437123974021,\n\
\ \"mc2_stderr\": 0.01500878766115849\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173306,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520766\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n\
\ \"acc_stderr\": 0.004525960965551707,\n \"acc_norm\": 0.882194781915953,\n\
\ \"acc_norm_stderr\": 0.003217184906847944\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n\
\ \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n\
\ \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n\
\ \"acc_stderr\": 0.031489558297455304,\n \"acc_norm\": 0.6340425531914894,\n\
\ \"acc_norm_stderr\": 0.031489558297455304\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"\
acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"\
acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.02907937453948001,\n \
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.02907937453948001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n\
\ \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n\
\ \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n\
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n\
\ \"acc_stderr\": 0.016361354769822475,\n \"acc_norm\": 0.39664804469273746,\n\
\ \"acc_norm_stderr\": 0.016361354769822475\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49608865710560623,\n\
\ \"acc_stderr\": 0.012769845366441194,\n \"acc_norm\": 0.49608865710560623,\n\
\ \"acc_norm_stderr\": 0.012769845366441194\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146366,\n \
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146366\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7195437123974021,\n\
\ \"mc2_stderr\": 0.01500878766115849\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370634\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \
\ \"acc_stderr\": 0.013113898382146877\n }\n}\n```"
repo_url: https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-32-35.779900.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- '**/details_harness|winogrande|5_2024-01-13T17-32-35.779900.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T17-32-35.779900.parquet'
- config_name: results
data_files:
- split: 2024_01_13T17_32_35.779900
path:
- results_2024-01-13T17-32-35.779900.parquet
- split: latest
path:
- results_2024-01-13T17-32-35.779900.parquet
---
# Dataset Card for Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP](https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP",
"harness_winogrande_5",
split="train")
```
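Once the detail or results files are loaded, the per-task `acc` values can be combined into a macro average, which is how the leaderboard summarizes a run. A minimal sketch, using three hard-coded entries copied from the results shown further down rather than a live download:

```python
# Minimal sketch: macro-average the per-task "acc" fields into one score.
# The sample dict copies three entries from the results JSON below; the
# real file contains one entry per evaluated task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.43},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7302631578947368},
}

# Each task contributes equally, regardless of its number of questions.
macro_acc = sum(entry["acc"] for entry in results.values()) / len(results)
print(f"macro accuracy over {len(results)} tasks: {macro_acc:.4f}")
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2` by swapping the key.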
## Latest results
These are the [latest results from run 2024-01-13T17:32:35.779900](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP/blob/main/results_2024-01-13T17-32-35.779900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6659359197324706,
"acc_stderr": 0.03167249441105516,
"acc_norm": 0.6667077779566729,
"acc_norm_stderr": 0.032318448519432046,
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7195437123974021,
"mc2_stderr": 0.01500878766115849
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173306,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520766
},
"harness|hellaswag|10": {
"acc": 0.7105158334993029,
"acc_stderr": 0.004525960965551707,
"acc_norm": 0.882194781915953,
"acc_norm_stderr": 0.003217184906847944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.016361354769822475,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.016361354769822475
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49608865710560623,
"acc_stderr": 0.012769845366441194,
"acc_norm": 0.49608865710560623,
"acc_norm_stderr": 0.012769845366441194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146366,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146366
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7195437123974021,
"mc2_stderr": 0.01500878766115849
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370634
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146877
}
}
```
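The per-task entries above can be aggregated offline. Below is a minimal sketch (not part of the evaluation harness) that averages `acc` over a few hendrycksTest subtasks copied from the results block; the full results file contains all 57 MMLU subtasks.

```python
# Sketch only: the dict below is a small subset copied by hand from the
# results JSON above, not the full file.
import json

results_snippet = """
{
  "harness|hendrycksTest-management|5": {"acc": 0.8349514563106796},
  "harness|hendrycksTest-marketing|5": {"acc": 0.8547008547008547},
  "harness|hendrycksTest-virology|5": {"acc": 0.5843373493975904}
}
"""

results = json.loads(results_snippet)
# Keep only MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"mean acc over {len(mmlu_accs)} subtasks: {mean_acc:.4f}")
```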
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
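As a sketch of direct use: each evaluated task is exposed as its own configuration whose name follows the `harness_<task>_<n_shot>` pattern listed in this card's YAML. The helper below (`task_config` is a hypothetical name, not part of any library) builds such a config name; with network access it can then be passed to `datasets.load_dataset`, as in the snippet at the top of this card.

```python
# Hypothetical helper: build the config name for one evaluated task,
# following the naming pattern used by this card's configs
# (e.g. "harness_winogrande_5").
REPO = "open-llm-leaderboard/details_Yuma42__KangalKhan-PrimordialSapphire-7B"

def task_config(task: str, n_shot: int) -> str:
    return f"harness_{task}_{n_shot}"

config = task_config("winogrande", 5)
print(REPO, config)

# With network access and the `datasets` library installed:
# from datasets import load_dataset
# data = load_dataset(REPO, config, split="latest")
```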
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
girrajjangid/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 28547714
num_examples: 5000
download_size: 17892667
dataset_size: 28547714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic | ---
pretty_name: Evaluation run of jilp00/Hermes-2-SOLAR-10.7B-Symbolic
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jilp00/Hermes-2-SOLAR-10.7B-Symbolic](https://huggingface.co/jilp00/Hermes-2-SOLAR-10.7B-Symbolic)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-07T21:33:05.098650](https://huggingface.co/datasets/open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic/blob/main/results_2024-01-07T21-33-05.098650.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6426233455922059,\n\
\ \"acc_stderr\": 0.03178345115211833,\n \"acc_norm\": 0.6530283252823872,\n\
\ \"acc_norm_stderr\": 0.0324878445417635,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.01676379072844634,\n \"mc2\": 0.5484982867197952,\n\
\ \"mc2_stderr\": 0.015225517770683289\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230914,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672874\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6030671181039634,\n\
\ \"acc_stderr\": 0.004882619484166602,\n \"acc_norm\": 0.8257319259111731,\n\
\ \"acc_norm_stderr\": 0.0037856457412359383\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388542,\n \"\
acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388542\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215282,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215282\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4748603351955307,\n\
\ \"acc_stderr\": 0.016701350842682632,\n \"acc_norm\": 0.4748603351955307,\n\
\ \"acc_norm_stderr\": 0.016701350842682632\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179622,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700855,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700855\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48239895697522817,\n\
\ \"acc_stderr\": 0.012762321298823643,\n \"acc_norm\": 0.48239895697522817,\n\
\ \"acc_norm_stderr\": 0.012762321298823643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700032,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700032\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306042,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.01676379072844634,\n \"mc2\": 0.5484982867197952,\n\
\ \"mc2_stderr\": 0.015225517770683289\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491906\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13949962092494314,\n \
\ \"acc_stderr\": 0.009543426687191282\n }\n}\n```"
repo_url: https://huggingface.co/jilp00/Hermes-2-SOLAR-10.7B-Symbolic
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|arc:challenge|25_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|gsm8k|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hellaswag|10_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T21-33-05.098650.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T21-33-05.098650.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- '**/details_harness|winogrande|5_2024-01-07T21-33-05.098650.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-07T21-33-05.098650.parquet'
- config_name: results
data_files:
- split: 2024_01_07T21_33_05.098650
path:
- results_2024-01-07T21-33-05.098650.parquet
- split: latest
path:
- results_2024-01-07T21-33-05.098650.parquet
---
# Dataset Card for Evaluation run of jilp00/Hermes-2-SOLAR-10.7B-Symbolic
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jilp00/Hermes-2-SOLAR-10.7B-Symbolic](https://huggingface.co/jilp00/Hermes-2-SOLAR-10.7B-Symbolic) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic",
"harness_winogrande_5",
split="train")
```
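The per-run split names above are derived from the run timestamp. A minimal sketch of that convention (inferred from the file listing in this card, not from any official specification): the `-` and `:` characters of the ISO timestamp are replaced with `_`.

```python
def split_name_from_timestamp(ts: str) -> str:
    """Convert a run timestamp (e.g. '2024-01-07T21:33:05.098650')
    into the split name used in this dataset's configurations
    (e.g. '2024_01_07T21_33_05.098650')."""
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-01-07T21:33:05.098650"))
# 2024_01_07T21_33_05.098650
```

This is how the timestamped splits (as opposed to `latest`) can be addressed programmatically when loading a specific run.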
## Latest results
These are the [latest results from run 2024-01-07T21:33:05.098650](https://huggingface.co/datasets/open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic/blob/main/results_2024-01-07T21-33-05.098650.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6426233455922059,
"acc_stderr": 0.03178345115211833,
"acc_norm": 0.6530283252823872,
"acc_norm_stderr": 0.0324878445417635,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.01676379072844634,
"mc2": 0.5484982867197952,
"mc2_stderr": 0.015225517770683289
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230914,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672874
},
"harness|hellaswag|10": {
"acc": 0.6030671181039634,
"acc_stderr": 0.004882619484166602,
"acc_norm": 0.8257319259111731,
"acc_norm_stderr": 0.0037856457412359383
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388542,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388542
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215282,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215282
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4748603351955307,
"acc_stderr": 0.016701350842682632,
"acc_norm": 0.4748603351955307,
"acc_norm_stderr": 0.016701350842682632
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179622,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700855,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700855
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48239895697522817,
"acc_stderr": 0.012762321298823643,
"acc_norm": 0.48239895697522817,
"acc_norm_stderr": 0.012762321298823643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700032,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.01676379072844634,
"mc2": 0.5484982867197952,
"mc2_stderr": 0.015225517770683289
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491906
},
"harness|gsm8k|5": {
"acc": 0.13949962092494314,
"acc_stderr": 0.009543426687191282
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
data-corentinv/dataset | ---
license: mit
---
|
aureliojafer/twitter_dataset_1710176283 | ---
dataset_info:
features:
- name: id
dtype: 'null'
- name: tweet_content
dtype: 'null'
- name: user_name
dtype: 'null'
- name: user_id
dtype: 'null'
- name: created_at
dtype: 'null'
- name: url
dtype: 'null'
- name: favourite_count
dtype: 'null'
- name: scraped_at
dtype: 'null'
- name: image_urls
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 2160
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RafaelSimonn/tokenizergpt2-Simon | ---
license: mit
---
|
polymer/dolphin-only-gpt-4 | ---
license: apache-2.0
task_categories:
- text-generation
duplicated_from: ehartford/dolphin
---
Dolphin 🐬
https://erichartford.com/dolphin
## Dataset details
This dataset is an attempt to replicate the results of [Microsoft's Orca](https://www.microsoft.com/en-us/research/publication/orca-progressive-learning-from-complex-explanation-traces-of-gpt-4/)
Our dataset consists of:
- ~1 million FLANv2 examples augmented with GPT-4 completions (flan1m-alpaca-uncensored.jsonl)
- ~3.5 million FLANv2 examples augmented with GPT-3.5 completions (flan5m-alpaca-uncensored.jsonl)
We followed the submix and system prompt distribution outlined in the Orca paper, with a few exceptions: we included all 75k CoT examples in the FLAN-1m dataset rather than sampling them, and, because many items were duplicated, we removed the duplicates, resulting in 3.5M instructions in the ChatGPT dataset.
Then we filtered out instances of alignment, refusal, avoidance, and bias, in order to produce an uncensored model upon which can be layered your personalized alignment LoRA.
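The dedup-and-filter pass described above can be sketched roughly as follows (illustrative only; the field names `instruction`/`output` and the refusal phrases are assumptions, not the actual pipeline):

```python
# Rough sketch of the dedup + refusal-filtering pass described above.
# Field names and refusal phrases are assumptions for illustration.
REFUSAL_MARKERS = ("as an ai language model", "i cannot", "i'm sorry, but")

def clean(records):
    seen = set()
    kept = []
    for rec in records:
        key = (rec["instruction"], rec["output"])
        if key in seen:  # drop exact duplicates
            continue
        seen.add(key)
        text = rec["output"].lower()
        if any(m in text for m in REFUSAL_MARKERS):  # drop refusals/avoidance
            continue
        kept.append(rec)
    return kept
```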
Token distribution for GPT-3.5 completions

### Loading
```python
from datasets import load_dataset

## load GPT-4 completions
dataset = load_dataset("ehartford/dolphin", data_files="flan1m-alpaca-uncensored.jsonl")
## load GPT-3.5 completions
dataset = load_dataset("ehartford/dolphin", data_files="flan5m-alpaca-uncensored.jsonl")
```
This dataset is licensed apache-2.0 for commercial or non-commercial use.
We currently plan to release Dolphin on:
- Xgen 7b 8k
- LLaMA 13b (Non-commercial)
- MPT 30b 8k
- LLaMA 33b (Non-commercial)
- Falcon 40b
- LLaMA 65b (Non-commercial)
The Dolphin models that are released will be subject to the license of the foundational model on which they are trained. (LLaMA releases will be non-commercial.)
I would like to thank the motley crew of Open Source AI/ML engineers who have worked beside me in this endeavor. Including:
- Wing "Caseus" Lian and NanoBit of OpenAccess AI Collective
- Rohan
- Teknium
- Pankaj Mathur
- Tom "TheBloke" Jobbins for quantizing and amplifying
- Special thanks to EdenCoder and chirper.ai for mentorship and financial sponsorship.
- Special thanks to Kilkonie for his very valued mentorship.
- All the other people in the Open Source AI community who have taught me and helped me along the way.
|
reginaboateng/Bioasq7b | ---
language: en
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: id
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 9973215.098861594
num_examples: 6000
- name: validation
num_bytes: 1123648.9011384062
num_examples: 676
download_size: 6069060
dataset_size: 11096864.0
---
# Dataset Card for "Bioasq7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RaphaelOlivier/whisper_adversarial_examples | ---
license: cc-by-4.0
---
# Description
This dataset is a subset of [LibriSpeech](https://huggingface.co/datasets/librispeech_asr) and multilingual [CommonVoice](https://commonvoice.mozilla.org/) that has been adversarially modified to fool the [Whisper](https://huggingface.co/openai/whisper-medium) ASR model.
Original [source code](https://github.com/RaphaelOlivier/whisper_attack).
The raw [tar files](https://data.mendeley.com/datasets/96dh52hz9r).
# Configurations and splits
* The `targeted` config contains targeted adversarial examples. When successful, they fool Whisper into predicting the sentence `OK Google, browse to evil.com` even if the input is entirely different. We provide a split for each Whisper model, and one containing the original, unmodified inputs.
* The `untargeted-35` and `untargeted-40` configs contain untargeted adversarial examples, with average signal-to-noise ratios of 35dB and 40dB respectively. They fool Whisper into predicting erroneous transcriptions. We provide a split for each Whisper model, and one containing the original, unmodified inputs.
* The `language-<lang>` configs contain adversarial examples in language `<lang>` that fool Whisper into predicting the wrong language. Split `<lang>.<target_lang>` contains inputs that Whisper perceives as `<target_lang>`, and split `<lang>.original` contains the original inputs in language `<lang>`. We use 3 target languages (English, Tagalog and Serbian) and 7 source languages (English, Italian, Indonesian, Danish, Czech, Lithuanian and Armenian).
# Usage
Here is an example of code using this dataset:
```python
from datasets import load_dataset
from evaluate import load
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_name = "medium"
config_name = "targeted"
split_name = "whisper.medium"
hub_path = "openai/whisper-" + model_name
processor = WhisperProcessor.from_pretrained(hub_path)
model = WhisperForConditionalGeneration.from_pretrained(hub_path).to("cuda")
dataset = load_dataset("RaphaelOlivier/whisper_adversarial_examples", config_name, split=split_name)

def map_to_pred(batch):
    input_features = processor(batch["audio"][0]["array"], return_tensors="pt").input_features
    predicted_ids = model.generate(input_features.to("cuda"))
    transcription = processor.batch_decode(predicted_ids, normalize=True)
    batch["text"][0] = processor.tokenizer._normalize(batch["text"][0])
    batch["transcription"] = transcription
    return batch

result = dataset.map(map_to_pred, batched=True, batch_size=1)
wer = load("wer")
for t in zip(result["text"], result["transcription"]):
    print(t)
# predictions are the model transcriptions; references are the ground-truth text
print(wer.compute(predictions=result["transcription"], references=result["text"]))
``` |
Vichayturen/referee_model_train_test | ---
license: apache-2.0
---
|
breno30/VozMc | ---
license: openrail
---
|
alisson40889/RAMBO | ---
license: openrail
---
|
yagnikposhiya/CommonVoiceCorpusUrdu15 | ---
license: apache-2.0
---
|
huggingface/autotrain-data-fix-punctuation-attention |
tasneem123/audios | ---
task_categories:
- audio-classification
language:
- ar
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Aehus/moneybag | ---
dataset_info:
features:
- name: new_input
dtype: string
- name: new_output
dtype: string
- name: new_instruction
dtype: string
splits:
- name: train
num_bytes: 9154
num_examples: 10
download_size: 0
dataset_size: 9154
---
# Dataset Card for "moneybag"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tilyupo/marco_cqa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 238025739
num_examples: 503370
- name: validation
num_bytes: 25810748
num_examples: 55636
download_size: 175452898
dataset_size: 263836487
---
# Dataset Card for "marco_cqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DBQ/Net.a.Porter.Product.prices.Tunisia | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Tunisia - Net-a-Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Net
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 17289050
num_examples: 42405
download_size: 5424439
dataset_size: 17289050
---
# Net-a-Porter web scraped data
## About the website
The **Net-a-Porter** brand operates within the highly competitive **Ecommerce** industry in the **EMEA** region, specifically in **Tunisia**. This industry encapsulates all buying and selling of goods or services online, with fashion and luxury products being a significant contributor. The growth of ecommerce platforms in Tunisia has led to a surge in the digital marketplace, playing a key role in boosting the economy of the country. Our dataset primarily revolves around **Ecommerce product-list page (PLP)** data of this renowned online retailer in Tunisia, offering insight into online customer behaviour and product performance.
## Link to **dataset**
[Tunisia - Net-a-Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Net-a-Porter%20Product-prices%20Tunisia/r/recyrW7jHdjmHA1Cr)
|
citibankdemobusiness/worldsrecord | ---
license: other
license_name: billionaire
license_link: https://github.com/CitibankDemoBusiness/billiondollars/blob/git/LICENSE
---
|
dariolopez/Llama-2-databricks-dolly-oasst1-es | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 18280331
num_examples: 18924
download_size: 10529271
dataset_size: 18280331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- es
size_categories:
- 10K<n<100K
---
# Llama-2-databricks-dolly-oasst1-es
Union of https://huggingface.co/datasets/dariolopez/Llama-2-databricks-dolly-es and https://huggingface.co/datasets/dariolopez/Llama-2-oasst1-es |
heliosprime/twitter_dataset_1712981361 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8832
num_examples: 19
download_size: 8597
dataset_size: 8832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712981361"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_openlm-research__open_llama_7b | ---
pretty_name: Evaluation run of openlm-research/open_llama_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openlm-research/open_llama_7b](https://huggingface.co/openlm-research/open_llama_7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openlm-research__open_llama_7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T17:26:48.856271](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_7b/blob/main/results_2023-10-18T17-26-48.856271.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.00029649629898012564,\n \"f1\": 0.054966442953020285,\n\
\ \"f1_stderr\": 0.00134099148142866,\n \"acc\": 0.3477395817189483,\n\
\ \"acc_stderr\": 0.008281452365035358\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012564,\n\
\ \"f1\": 0.054966442953020285,\n \"f1_stderr\": 0.00134099148142866\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6795580110497238,\n \"acc_stderr\": 0.013115085457681712\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openlm-research/open_llama_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T17_12_53.113186
path:
- '**/details_harness|drop|3_2023-10-16T17-12-53.113186.parquet'
- split: 2023_10_18T17_26_48.856271
path:
- '**/details_harness|drop|3_2023-10-18T17-26-48.856271.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T17-26-48.856271.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T17_12_53.113186
path:
- '**/details_harness|gsm8k|5_2023-10-16T17-12-53.113186.parquet'
- split: 2023_10_18T17_26_48.856271
path:
- '**/details_harness|gsm8k|5_2023-10-18T17-26-48.856271.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T17-26-48.856271.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:27:20.581564.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:27:20.581564.parquet'
- split: 2023_07_19T10_52_35.127282
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:52:35.127282.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:52:35.127282.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T17_12_53.113186
path:
- '**/details_harness|winogrande|5_2023-10-16T17-12-53.113186.parquet'
- split: 2023_10_18T17_26_48.856271
path:
- '**/details_harness|winogrande|5_2023-10-18T17-26-48.856271.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T17-26-48.856271.parquet'
- config_name: results
data_files:
- split: 2023_07_18T12_27_20.581564
path:
- results_2023-07-18T12:27:20.581564.parquet
- split: 2023_07_19T10_52_35.127282
path:
- results_2023-07-19T10:52:35.127282.parquet
- split: 2023_10_16T17_12_53.113186
path:
- results_2023-10-16T17-12-53.113186.parquet
- split: 2023_10_18T17_26_48.856271
path:
- results_2023-10-18T17-26-48.856271.parquet
- split: latest
path:
- results_2023-10-18T17-26-48.856271.parquet
---
# Dataset Card for Evaluation run of openlm-research/open_llama_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openlm-research/open_llama_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openlm-research/open_llama_7b](https://huggingface.co/openlm-research/open_llama_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_7b",
"harness_winogrande_5",
split="train")
```
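Each per-run split name is simply the run timestamp with `-` and `:` replaced by underscores (compare the `split:` and `path:` entries in the config listing above). A small helper along these lines (hypothetical, not part of the `datasets` API) maps a run timestamp to its split name, so you can load one specific run instead of `latest`:

```python
def split_name_for_run(timestamp: str) -> str:
    """Map a run timestamp such as '2023-10-18T17:26:48.856271'
    to the split name used in this dataset."""
    return timestamp.replace("-", "_").replace(":", "_")

split = split_name_for_run("2023-10-18T17:26:48.856271")
# split == "2023_10_18T17_26_48.856271"

# To download that run's details (requires the `datasets` library and network access):
# from datasets import load_dataset
# data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_7b",
#                     "harness_winogrande_5",
#                     split=split)
```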
## Latest results
These are the [latest results from run 2023-10-18T17:26:48.856271](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_7b/blob/main/results_2023-10-18T17-26-48.856271.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012564,
"f1": 0.054966442953020285,
"f1_stderr": 0.00134099148142866,
"acc": 0.3477395817189483,
"acc_stderr": 0.008281452365035358
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012564,
"f1": 0.054966442953020285,
"f1_stderr": 0.00134099148142866
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890037
},
"harness|winogrande|5": {
"acc": 0.6795580110497238,
"acc_stderr": 0.013115085457681712
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ktrinh38/bandier | ---
dataset_info:
features:
- name: folder
dtype: string
- name: path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 572697611.476
num_examples: 1604
download_size: 564763352
dataset_size: 572697611.476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nishakathiriya/images | ---
task_categories:
- image-classification
language:
- en
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_allknowingroger__MultiverseEx26-7B-slerp | ---
pretty_name: Evaluation run of allknowingroger/MultiverseEx26-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/MultiverseEx26-7B-slerp](https://huggingface.co/allknowingroger/MultiverseEx26-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__MultiverseEx26-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T21:45:43.672625](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__MultiverseEx26-7B-slerp/blob/main/results_2024-04-10T21-45-43.672625.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6503713327803462,\n\
\ \"acc_stderr\": 0.032074423019835756,\n \"acc_norm\": 0.6492277917540636,\n\
\ \"acc_norm_stderr\": 0.03275225998907174,\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720134,\n \"mc2\": 0.7812190256049953,\n\
\ \"mc2_stderr\": 0.013671524144533839\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520766,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.717486556462856,\n\
\ \"acc_stderr\": 0.004493015945599716,\n \"acc_norm\": 0.8916550487950607,\n\
\ \"acc_norm_stderr\": 0.0031018035745563103\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806318,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720134,\n \"mc2\": 0.7812190256049953,\n\
\ \"mc2_stderr\": 0.013671524144533839\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184136\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.01249392734865963\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/MultiverseEx26-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|arc:challenge|25_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|gsm8k|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hellaswag|10_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-45-43.672625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T21-45-43.672625.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- '**/details_harness|winogrande|5_2024-04-10T21-45-43.672625.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T21-45-43.672625.parquet'
- config_name: results
data_files:
- split: 2024_04_10T21_45_43.672625
path:
- results_2024-04-10T21-45-43.672625.parquet
- split: latest
path:
- results_2024-04-10T21-45-43.672625.parquet
---
# Dataset Card for Evaluation run of allknowingroger/MultiverseEx26-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/MultiverseEx26-7B-slerp](https://huggingface.co/allknowingroger/MultiverseEx26-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, one corresponding to each of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__MultiverseEx26-7B-slerp",
"harness_winogrande_5",
split="train")
```
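As the config listing above shows, each timestamped split name is derived from the run timestamp by replacing `-` and `:` with `_` (the fractional-seconds dot is kept), e.g. run `2024-04-10T21:45:43.672625` becomes split `2024_04_10T21_45_43.672625`. A small helper (hypothetical, not part of the `datasets` library) to derive a split name from a run timestamp:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its dataset split name.

    Note: this helper is an illustration based on the split names in this
    card, not an official utility of the `datasets` library.
    e.g. "2024-04-10T21:45:43.672625" -> "2024_04_10T21_45_43.672625"
    """
    return ts.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2024-04-10T21:45:43.672625"))
```

This can be handy when selecting a specific run's split instead of `"train"` or `"latest"`.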
## Latest results
These are the [latest results from run 2024-04-10T21:45:43.672625](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__MultiverseEx26-7B-slerp/blob/main/results_2024-04-10T21-45-43.672625.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6503713327803462,
"acc_stderr": 0.032074423019835756,
"acc_norm": 0.6492277917540636,
"acc_norm_stderr": 0.03275225998907174,
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720134,
"mc2": 0.7812190256049953,
"mc2_stderr": 0.013671524144533839
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520766,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.717486556462856,
"acc_stderr": 0.004493015945599716,
"acc_norm": 0.8916550487950607,
"acc_norm_stderr": 0.0031018035745563103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806318,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720134,
"mc2": 0.7812190256049953,
"mc2_stderr": 0.013671524144533839
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184136
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.01249392734865963
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
argmaxinc/mlx-bench-regression-tests | ---
viewer: false
---
# MLX Bench
- Benchmarks standard workloads such as [text generation with Mistral-7b](https://github.com/ml-explore/mlx-examples/tree/main/llms/mistral) across commits and forks
- Automatically generated by [https://github.com/argmaxinc/mlx-bench](https://github.com/argmaxinc/mlx-bench)
- Logs performance and correctness test results on various Apple Silicon Macs |
plaguss/curation-ultrafeedback-scores | ---
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: best_rated_is_different_from_best_overall
dtype: bool
- name: best_overall_model
dtype: string
- name: score_best_overall
dtype: float64
- name: best_rated_model
dtype: string
- name: score_best_rated
dtype: float64
- name: best_overall_score_response
struct:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: average_rating
dtype: float64
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: random_response_for_best_overall
struct:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: average_rating
dtype: float64
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: best_rated_response
struct:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: average_rating
dtype: float64
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: random_response_for_best_rated
struct:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: average_rating
dtype: float64
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: score_random_response_for_best_overall
dtype: float64
- name: score_random_response_for_rated
dtype: float64
- name: completions
list:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: average_rating
dtype: float64
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: random_response_for_rated
dtype: float64
- name: rating-distilabel-gpt4
sequence: float64
- name: rationale-distilabel-gpt4
sequence: string
splits:
- name: train
num_bytes: 46976035
num_examples: 2405
download_size: 18006660
dataset_size: 46976035
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
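Each response column above is a struct that nests per-aspect annotations (`helpfulness`, `honesty`, `instruction_following`, `truthfulness`), with ratings stored as strings. A minimal sketch of pulling those ratings out of one record with this shape; the record below is hypothetical, not taken from the dataset:

```python
# Minimal sketch: extracting per-aspect ratings from one record that
# follows the nested schema above. The record is a hypothetical stand-in;
# real rows come from loading this dataset with `datasets.load_dataset`.
record = {
    "best_overall_score_response": {
        "model": "some-model",
        "overall_score": 9.0,
        "annotations": {
            "helpfulness": {"Rating": "5", "Rationale": "..."},
            "honesty": {"Rating": "4", "Rationale": "..."},
            "instruction_following": {"Rating": "5", "Rationale": "..."},
            "truthfulness": {"Rating": "5", "Rationale": "..."},
        },
    }
}

annotations = record["best_overall_score_response"]["annotations"]
# Ratings are typed as strings in the schema; cast before averaging.
ratings = [int(aspect["Rating"]) for aspect in annotations.values()]
average_rating = sum(ratings) / len(ratings)
print(average_rating)  # 4.75
```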
|
betogaunt/minhasvozes.zip | ---
license: openrail
---
|
pipXBT/dxdx_market_data | ---
license: apache-2.0
---
|
DjSteker/dataset_train__id_url_title_text | ---
dataset_info:
features:
- name: id
dtype: float64
- name: url
dtype: float64
- name: title
dtype: float64
- name: text
dtype: float64
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 1091
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edbeeching/prj_gia_dataset_mujoco_pendulum_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning environment for the mujoco_pendulum environment, containing samples from the policy mujoco_pendulum_1111
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
|
Prarabdha/Paul_RNA_Sequence_Processed_Dataset | ---
license: mit
---
|
ByteResearch/role-play-240405 | ---
license: apache-2.0
---
|
AmelieSchreiber/pha_clustered_protein_complexes_40K | ---
license: mit
---
|
Ammok/apple_stock_price_from_1980-2021 | ---
license: odc-by
task_categories:
- time-series-forecasting
- tabular-regression
language:
- en
pretty_name: apple stock price from 1980-2021
--- |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xxl_mode_A_T_CM_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 141914
num_examples: 1000
download_size: 53553
dataset_size: 141914
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xxl_mode_A_T_CM_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat | ---
pretty_name: Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FlagAlpha/Llama2-Chinese-13b-Chat](https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T06:13:18.397506](https://huggingface.co/datasets/open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat/blob/main/results_2023-10-13T06-13-18.397506.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3886325503355705,\n\
\ \"em_stderr\": 0.004991836977358219,\n \"f1\": 0.4460098573825516,\n\
\ \"f1_stderr\": 0.004836724027731064,\n \"acc\": 0.4437472960609105,\n\
\ \"acc_stderr\": 0.010555580633054316\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3886325503355705,\n \"em_stderr\": 0.004991836977358219,\n\
\ \"f1\": 0.4460098573825516,\n \"f1_stderr\": 0.004836724027731064\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12585291887793784,\n \
\ \"acc_stderr\": 0.009136212598406319\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702313\n\
\ }\n}\n```"
repo_url: https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T06_13_18.397506
path:
- '**/details_harness|drop|3_2023-10-13T06-13-18.397506.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T06-13-18.397506.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T06_13_18.397506
path:
- '**/details_harness|gsm8k|5_2023-10-13T06-13-18.397506.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T06-13-18.397506.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:12:34.146693.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:12:34.146693.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T06_13_18.397506
path:
- '**/details_harness|winogrande|5_2023-10-13T06-13-18.397506.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T06-13-18.397506.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_12_34.146693
path:
- results_2023-08-17T19:12:34.146693.parquet
- split: 2023_10_13T06_13_18.397506
path:
- results_2023-10-13T06-13-18.397506.parquet
- split: latest
path:
- results_2023-10-13T06-13-18.397506.parquet
---
# Dataset Card for Evaluation run of FlagAlpha/Llama2-Chinese-13b-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FlagAlpha/Llama2-Chinese-13b-Chat](https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T06:13:18.397506](https://huggingface.co/datasets/open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-13b-Chat/blob/main/results_2023-10-13T06-13-18.397506.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3886325503355705,
"em_stderr": 0.004991836977358219,
"f1": 0.4460098573825516,
"f1_stderr": 0.004836724027731064,
"acc": 0.4437472960609105,
"acc_stderr": 0.010555580633054316
},
"harness|drop|3": {
"em": 0.3886325503355705,
"em_stderr": 0.004991836977358219,
"f1": 0.4460098573825516,
"f1_stderr": 0.004836724027731064
},
"harness|gsm8k|5": {
"acc": 0.12585291887793784,
"acc_stderr": 0.009136212598406319
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702313
}
}
```
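The aggregate `acc` in the results above appears to be the unweighted mean over the accuracy-reporting tasks (gsm8k and winogrande); a quick sanity check, using only the numbers printed in the block:

```python
# Per-task accuracies copied from the results block above
gsm8k_acc = 0.12585291887793784
winogrande_acc = 0.7616416732438832

# The aggregate "acc" is the unweighted mean of the per-task accuracies
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ≈ 0.4437472960609105, matching the "all" entry
```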
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
evkes/required-deloitte-jobs | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 539333
num_examples: 327
download_size: 205214
dataset_size: 539333
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "required-deloitte-jobs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sayed121/ControlNetPublic | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: detected_map
dtype: image
splits:
- name: train
num_bytes: 18451910.0
num_examples: 40
download_size: 18440309
dataset_size: 18451910.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shwetha729/quantum-machine-learning | ---
license: gpl
---
A continuous data scrape of arXiv and Google Scholar papers on quantum machine learning, particularly regarding climate. |
reshinthadith/2048_has_code_filtered_base_code_review_python_based_on_property | ---
dataset_info:
features:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: meta_data
struct:
- name: AcceptedAnswerId
dtype: string
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: Tags
sequence: string
- name: Title
dtype: string
- name: question_id
dtype: string
- name: yield
dtype: string
- name: answers
list:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: meta_data
struct:
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: ParentId
dtype: string
- name: Score
dtype: string
splits:
- name: train
num_bytes: 28462610
num_examples: 6398
download_size: 0
dataset_size: 28462610
---
# Dataset Card for "2048_has_code_filtered_base_code_review_python_based_on_property"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
parsi-ai-nlpclass/Sharif-Pors-AllQuAD | ---
dataset_info:
features:
- name: context
dtype: string
- name: title
dtype: string
- name: queries
sequence: string
splits:
- name: train
num_bytes: 53361742
num_examples: 31665
- name: test
num_bytes: 6382486
num_examples: 3425
- name: val
num_bytes: 5847951
num_examples: 3519
download_size: 32353283
dataset_size: 65592179
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
open-llm-leaderboard/details_nisten__shqiponja-15b-v1 | ---
pretty_name: Evaluation run of nisten/shqiponja-15b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nisten/shqiponja-15b-v1](https://huggingface.co/nisten/shqiponja-15b-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nisten__shqiponja-15b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T14:57:48.901535](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__shqiponja-15b-v1/blob/main/results_2024-02-09T14-57-48.901535.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6502238830390134,\n\
\ \"acc_stderr\": 0.03202691421399621,\n \"acc_norm\": 0.6499858115249134,\n\
\ \"acc_norm_stderr\": 0.03269775340356268,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5681041768987346,\n\
\ \"mc2_stderr\": 0.015360715175436088\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n\
\ \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6682931686914957,\n\
\ \"acc_stderr\": 0.004698640688271197,\n \"acc_norm\": 0.8526190001991635,\n\
\ \"acc_norm_stderr\": 0.003537608501069177\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7483870967741936,\n \"acc_stderr\": 0.02468597928623996,\n \"\
acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.02468597928623996\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092427,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092427\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521271,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521271\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n\
\ \"acc_stderr\": 0.016145881256056212,\n \"acc_norm\": 0.36983240223463687,\n\
\ \"acc_norm_stderr\": 0.016145881256056212\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5681041768987346,\n\
\ \"mc2_stderr\": 0.015360715175436088\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \
\ \"acc_stderr\": 0.012731710925078138\n }\n}\n```"
repo_url: https://huggingface.co/Yuma42/KangalKhan-PrimordialSapphire-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|arc:challenge|25_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|gsm8k|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hellaswag|10_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-57-48.901535.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T14-57-48.901535.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- '**/details_harness|winogrande|5_2024-02-09T14-57-48.901535.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T14-57-48.901535.parquet'
- config_name: results
data_files:
- split: 2024_02_09T14_57_48.901535
path:
- results_2024-02-09T14-57-48.901535.parquet
- split: latest
path:
- results_2024-02-09T14-57-48.901535.parquet
---
# Dataset Card for Evaluation run of nisten/shqiponja-15b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nisten/shqiponja-15b-v1](https://huggingface.co/nisten/shqiponja-15b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nisten__shqiponja-15b-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T14:57:48.901535](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__shqiponja-15b-v1/blob/main/results_2024-02-09T14-57-48.901535.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6502238830390134,
"acc_stderr": 0.03202691421399621,
"acc_norm": 0.6499858115249134,
"acc_norm_stderr": 0.03269775340356268,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5681041768987346,
"mc2_stderr": 0.015360715175436088
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6682931686914957,
"acc_stderr": 0.004698640688271197,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.003537608501069177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337124,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337124
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.02468597928623996,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.02468597928623996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092427,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092427
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521271,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521271
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056212,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056212
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5681041768987346,
"mc2_stderr": 0.015360715175436088
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078138
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kaleemWaheed/twitter_dataset_1713042561 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11489
num_examples: 27
download_size: 9456
dataset_size: 11489
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gawoon/mnist-v1 | ---
dataset_info:
features:
- name: image
sequence: float64
- name: label
dtype: int32
splits:
- name: train
num_bytes: 439600000
num_examples: 70000
download_size: 16169837
dataset_size: 439600000
---
# Dataset Card for "mnist-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_if_would | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 9280
num_examples: 35
- name: train
num_bytes: 20300
num_examples: 73
- name: validation
num_bytes: 2076
num_examples: 7
download_size: 32453
dataset_size: 31656
---
# Dataset Card for "MULTI_VALUE_mrpc_if_would"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
parambharat/kannada_asr_corpus | ---
annotations_creators:
- found
language:
- kn
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Kannada ASR Corpus
size_categories:
- 100K<n<1M
source_datasets:
- extended|openslr
tags: []
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for Kannada ASR Corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@parambharat](https://github.com/parambharat) for adding this dataset. |
swaroopajit/next-dataset-refined-batch-9000 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 335325495.0
num_examples: 1000
download_size: 309863965
dataset_size: 335325495.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "next-dataset-refined-batch-9000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexjercan/AoC | ---
license: apache-2.0
dataset_info:
features:
- name: year
dtype: string
- name: day
dtype: string
- name: part
dtype: string
- name: pass
dtype: string
- name: fail
dtype: string
- name: test
dtype: string
- name: change
dtype: string
- name: i1
dtype: uint32
- name: i2
dtype: uint32
- name: j1
dtype: uint32
- name: j2
dtype: uint32
splits:
- name: train
num_bytes: 21469
num_examples: 15
download_size: 23847
dataset_size: 21469
---
# About the Dataset
This dataset is inspired by [HumanEval](https://github.com/openai/human-eval)
The source code used to generate the dataset can be found on [GitHub](https://github.com/alexjercan/bug-detection/tree/master/aoc-dataset)
A collection of submissions for the Advent of Code challenge.
This repository contains both passing and failing submissions.
This dataset is similar to [BugNet](https://huggingface.co/datasets/alexjercan/bugnet); however, it is meant to be used as an evaluation dataset.
The resulting dataset file will be a CSV with the following columns:
- `year`: Used to identify the submission
- `day`: Used to identify the submission
- `part`: Used to identify the submission
- `fail`: The initial (buggy) source code formatted (`black`)
- `pass`: The modified (accepted) source code formatted (`black`)
- `change`: The change that was made (`replace`, `insert`, `delete`)
- `i1`: Start of the change in the buggy source (the line; starting with 1)
- `i2`: End of the change in the buggy source (not inclusive; for insert we have i1 == i2)
- `j1`: Start of the change in the accepted source (the line; starting with 1)
- `j2`: End of the change in the accepted source (not inclusive; for delete we have j1 == j2)
- `test`: The test case that can be used to evaluate the submission. |
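
The `i1`/`i2`/`j1`/`j2` line spans follow the same half-open convention as Python's `difflib`, shifted to start at 1. As a rough sketch (not the repository's actual generation code — the function name here is illustrative), spans of this shape can be recovered from a `fail`/`pass` pair like so:

```python
# Sketch: derive 1-indexed, half-open change spans (as in this dataset's
# i1/i2/j1/j2 columns) from a buggy/fixed source pair using difflib.
import difflib


def change_spans(fail_src: str, pass_src: str):
    """Yield (tag, i1, i2, j1, j2) for each changed region.

    difflib opcodes are 0-indexed and half-open; the dataset's spans start
    at line 1, so we shift both ends by one. For an insert, i1 == i2; for a
    delete, j1 == j2 — matching the column descriptions above.
    """
    fail_lines = fail_src.splitlines()
    pass_lines = pass_src.splitlines()
    matcher = difflib.SequenceMatcher(a=fail_lines, b=pass_lines)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag != "equal":
            yield tag, i1 + 1, i2 + 1, j1 + 1, j2 + 1
```

For example, a one-line off-by-one fix on the second line of a two-line program yields a single `("replace", 2, 3, 2, 3)` span.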
NicholasSynovic/Victorian-Era-Authorship-Attribution | ---
language:
- en
pretty_name: Victorian Era Authorship Attribution Data Set
task_categories:
- text-classification
size_categories:
- 10K<n<100K
---
# Victorian Era Authorship Attribution Data Set
> GUNGOR, ABDULMECIT, Benchmarking Authorship Attribution Techniques Using Over A Thousand Books by Fifty Victorian Era Novelists, Purdue Master of Thesis, 2018-04
## NOTICE
This dataset was downloaded from the [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/index.php) at [this link](https://archive.ics.uci.edu/ml/datasets/Victorian+Era+Authorship+Attribution).
The [description](#description) of this dataset was copied from the source's dataset card. However, I have applied Markdown styling to prettify it and make it easier to navigate.
## Description
> **Abstract**: To create the largest authorship attribution dataset, we extracted works of 50 well-known authors. To have a non-exhaustive learning, in training there are 45 authors whereas, in the testing, it's 50
### Source
The texts were extracted from the GDELT database. The GDELT Project is an open platform for research and analysis of global society, and thus all datasets released by the GDELT Project are available for unlimited and unrestricted use for any academic, commercial, or governmental use of any kind without fee.
### Data Set Information
To decrease bias and create a reliable authorship attribution dataset, the following criteria were used to filter authors in the GDELT database: authors writing in English, authors with enough books available (at least 5), and 19th-century authors. With these criteria, 50 authors were selected and their books were queried through the BigQuery GDELT database.

The next task was cleaning the dataset, due to OCR reading problems in the original raw form. To achieve that, all books were first scanned through to get the overall number of unique words and each word's frequency. While scanning the texts, the first 500 words and the last 500 words were removed to take out specific features, such as the name of the author, the name of the book, and other word-specific features that could make the classification task easier. After this step, the top 10,000 words occurring across the whole 50-author text corpus were chosen; words outside the top 10,000 were removed while keeping the rest of the sentence structure intact.

Each book was then split into text fragments of 1,000 words each, with author and book identification numbers maintained separately for each fragment in different arrays. Text segments with fewer than 1,000 words were filled with zeros to keep them in the dataset as well. 1,000 words make approximately 2 pages of writing, which is long enough to extract a variety of features from the document. Each instance in the training set consists of a text piece of 1,000 words with an author id attached; in the testing set, there is only the text piece of 1,000 words on which to do authorship attribution. The training data covers 45 authors and the testing data 50, with 34% of the testing data coming from unknown authors.
### Attribute Information
Each instance consists of 1000 word sequences that are divided from the works of every author's book. In the training, the author id is also provided.
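
As a rough illustration of the fragmentation and zero-padding step described above (a sketch only, not the dataset authors' code — the function name and padding token are illustrative), the 1,000-word instances could be produced like this:

```python
# Sketch: split a token stream into fixed-size fragments, zero-padding the
# final short fragment so it stays in the dataset, as the card describes.
def fragment(tokens, size=1000, pad=0):
    """Return a list of fragments of exactly `size` tokens each."""
    pieces = []
    for start in range(0, len(tokens), size):
        piece = tokens[start:start + size]
        if len(piece) < size:
            # Keep the trailing remainder by padding it up to `size`.
            piece = piece + [pad] * (size - len(piece))
        pieces.append(piece)
    return pieces
```

A 2,500-token book would thus yield three instances: two full fragments and one fragment whose last 500 positions are zeros.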
### Relevant Papers
* E. Stamatatos, A Survey of Modern Authorship Attribution Methods. Journal of the American Society for Information Science and Technology, 2009.
## Citation Request:
* `GUNGOR, ABDULMECIT, Benchmarking Authorship Attribution Techniques Using Over A Thousand Books by Fifty Victorian Era Novelists, Purdue Master of Thesis, 2018-04` |
open-llm-leaderboard/details_fhai50032__Mistral-4B-FT-2 | ---
pretty_name: Evaluation run of fhai50032/Mistral-4B-FT-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fhai50032/Mistral-4B-FT-2](https://huggingface.co/fhai50032/Mistral-4B-FT-2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fhai50032__Mistral-4B-FT-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T16:04:01.032636](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__Mistral-4B-FT-2/blob/main/results_2024-03-21T16-04-01.032636.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.256545358914205,\n\
\ \"acc_stderr\": 0.030693487949608498,\n \"acc_norm\": 0.2570833293895939,\n\
\ \"acc_norm_stderr\": 0.03143794023974341,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.4633447036093237,\n\
\ \"mc2_stderr\": 0.01515939809395832\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21075085324232082,\n \"acc_stderr\": 0.011918271754852185,\n\
\ \"acc_norm\": 0.2593856655290102,\n \"acc_norm_stderr\": 0.012808273573927094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3316072495518821,\n\
\ \"acc_stderr\": 0.0046982853500192375,\n \"acc_norm\": 0.3963353913563035,\n\
\ \"acc_norm_stderr\": 0.004881359589148991\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343602,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343602\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\"\
: 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380042,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380042\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131183,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131183\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n\
\ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.3032258064516129,\n\
\ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.034801756684660366,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.034801756684660366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.022139081103971524,\n\
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.022139081103971524\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861507,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604236,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604236\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.38565022421524664,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623101,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623101\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455768,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455768\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478949,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478949\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n\
\ \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.044612721759105065,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.044612721759105065\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.363265306122449,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.363265306122449,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.4633447036093237,\n\
\ \"mc2_stderr\": 0.01515939809395832\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5659037095501184,\n \"acc_stderr\": 0.013929882555694058\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02880970432145565,\n \
\ \"acc_stderr\": 0.004607484283767466\n }\n}\n```"
repo_url: https://huggingface.co/fhai50032/Mistral-4B-FT-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|arc:challenge|25_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|gsm8k|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hellaswag|10_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-04-01.032636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T16-04-01.032636.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- '**/details_harness|winogrande|5_2024-03-21T16-04-01.032636.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T16-04-01.032636.parquet'
- config_name: results
data_files:
- split: 2024_03_21T16_04_01.032636
path:
- results_2024-03-21T16-04-01.032636.parquet
- split: latest
path:
- results_2024-03-21T16-04-01.032636.parquet
---
# Dataset Card for Evaluation run of fhai50032/Mistral-4B-FT-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fhai50032/Mistral-4B-FT-2](https://huggingface.co/fhai50032/Mistral-4B-FT-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fhai50032__Mistral-4B-FT-2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T16:04:01.032636](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__Mistral-4B-FT-2/blob/main/results_2024-03-21T16-04-01.032636.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.256545358914205,
"acc_stderr": 0.030693487949608498,
"acc_norm": 0.2570833293895939,
"acc_norm_stderr": 0.03143794023974341,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.4633447036093237,
"mc2_stderr": 0.01515939809395832
},
"harness|arc:challenge|25": {
"acc": 0.21075085324232082,
"acc_stderr": 0.011918271754852185,
"acc_norm": 0.2593856655290102,
"acc_norm_stderr": 0.012808273573927094
},
"harness|hellaswag|10": {
"acc": 0.3316072495518821,
"acc_stderr": 0.0046982853500192375,
"acc_norm": 0.3963353913563035,
"acc_norm_stderr": 0.004881359589148991
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343602,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343602
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0309528902177499,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0309528902177499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131183,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131183
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.022139081103971524,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.022139081103971524
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861507,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604236,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604236
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623101,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623101
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455768,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455768
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478949,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478949
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.044612721759105065,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.044612721759105065
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.363265306122449,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.363265306122449,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.4633447036093237,
"mc2_stderr": 0.01515939809395832
},
"harness|winogrande|5": {
"acc": 0.5659037095501184,
"acc_stderr": 0.013929882555694058
},
"harness|gsm8k|5": {
"acc": 0.02880970432145565,
"acc_stderr": 0.004607484283767466
}
}
```
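Once loaded, the results above are plain nested dicts, so simple post-processing needs no extra tooling. A minimal sketch, using a hand-copied subset of the values shown (to avoid a network download; in practice the dict would come from this repo's "results" config):

```python
# Rank tasks by accuracy from a results dict shaped like the one above.
# The literal below is a subset of the values printed in this card.
results = {
    "all": {"acc": 0.256545358914205},
    "harness|winogrande|5": {"acc": 0.5659037095501184},
    "harness|gsm8k|5": {"acc": 0.02880970432145565},
}

# Drop the "all" aggregate, then sort individual tasks best-first.
per_task = {name: v["acc"] for name, v in results.items() if name != "all"}
ranked = sorted(per_task, key=per_task.get, reverse=True)
print(ranked[0])  # prints the strongest task in this subset
```

The same pattern applies to the full dict with all 57 MMLU subtasks.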
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
VivendoDigital/pescarai-wiki-full | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: query_count
dtype: int64
splits:
- name: train
num_bytes: 60073614
num_examples: 5580
download_size: 34433024
dataset_size: 60073614
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
smhdigital/smh-tiny-gpt2_bert-fine-tuned | ---
license: other
---
|
dst19/UltraSharp | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_KnutJaegersberg__Walter-SOLAR-11B | ---
pretty_name: Evaluation run of KnutJaegersberg/Walter-SOLAR-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Walter-SOLAR-11B](https://huggingface.co/KnutJaegersberg/Walter-SOLAR-11B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Walter-SOLAR-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T17:23:07.067772](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-SOLAR-11B/blob/main/results_2023-12-16T17-23-07.067772.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6404707310339822,\n\
\ \"acc_stderr\": 0.0318661190240921,\n \"acc_norm\": 0.6524926684204396,\n\
\ \"acc_norm_stderr\": 0.03268492668160191,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4487960219023809,\n\
\ \"mc2_stderr\": 0.014224892990272523\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230918,\n\
\ \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.01429122839353659\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6549492133041227,\n\
\ \"acc_stderr\": 0.004744132825391527,\n \"acc_norm\": 0.848635729934276,\n\
\ \"acc_norm_stderr\": 0.003576711065619589\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361074,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361074\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432108,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432108\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n\
\ \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n\
\ \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n\
\ \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"\
acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\"\
: 0.4312169312169312,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8282828282828283,\n \"acc_stderr\": 0.02686971618742991,\n \"\
acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.02686971618742991\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406216,\n \
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406216\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374296,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374296\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318667,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318667\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990945,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\
\ \"acc_stderr\": 0.016476342210254,\n \"acc_norm\": 0.4145251396648045,\n\
\ \"acc_norm_stderr\": 0.016476342210254\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826507,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826507\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"\
acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472655,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472655\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368053,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368053\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4487960219023809,\n\
\ \"mc2_stderr\": 0.014224892990272523\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597202\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \
\ \"acc_stderr\": 0.0027210765770416586\n }\n}\n```"
repo_url: https://huggingface.co/Yuma42/KangalKhan-PrimordialSapphire-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|arc:challenge|25_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|gsm8k|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hellaswag|10_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-23-07.067772.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T17-23-07.067772.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- '**/details_harness|winogrande|5_2023-12-16T17-23-07.067772.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T17-23-07.067772.parquet'
- config_name: results
data_files:
- split: 2023_12_16T17_23_07.067772
path:
- results_2023-12-16T17-23-07.067772.parquet
- split: latest
path:
- results_2023-12-16T17-23-07.067772.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Walter-SOLAR-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Walter-SOLAR-11B](https://huggingface.co/KnutJaegersberg/Walter-SOLAR-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Walter-SOLAR-11B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-16T17:23:07.067772](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-SOLAR-11B/blob/main/results_2023-12-16T17-23-07.067772.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6404707310339822,
"acc_stderr": 0.0318661190240921,
"acc_norm": 0.6524926684204396,
"acc_norm_stderr": 0.03268492668160191,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4487960219023809,
"mc2_stderr": 0.014224892990272523
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230918,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.01429122839353659
},
"harness|hellaswag|10": {
"acc": 0.6549492133041227,
"acc_stderr": 0.004744132825391527,
"acc_norm": 0.848635729934276,
"acc_norm_stderr": 0.003576711065619589
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361074,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361074
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432108,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432108
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.0255064816981382,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.0255064816981382
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.02686971618742991,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.02686971618742991
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406216,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406216
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374296,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374296
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318667,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318667
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990945,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990945
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210254,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210254
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826507,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826507
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472655,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472655
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378459,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378459
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368053,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368053
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4487960219023809,
"mc2_stderr": 0.014224892990272523
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597202
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416586
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cl-nagoya/jmarco | ---
dataset_info:
features:
- name: query_ja
dtype: string
- name: positive_ja
dtype: string
- name: negative_ja
dtype: string
- name: query_en
dtype: string
- name: positive_en
dtype: string
- name: negative_en
dtype: string
splits:
- name: train
num_bytes: 1704514
num_examples: 1000
download_size: 695047
dataset_size: 1704514
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
carnival13/xlmr_int_hard_curr_trn_ep3_corr | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 428103904
num_examples: 339150
download_size: 121213275
dataset_size: 428103904
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xlmr_int_hard_curr_trn_ep3_corr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/289673e1 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 168
num_examples: 10
download_size: 1327
dataset_size: 168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "289673e1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tingchih/v1 | ---
dataset_info:
features:
- name: Documents
sequence: string
- name: Entailment_Claims
sequence: string
- name: Neutral_Claims
sequence: string
- name: Contradiction_Claims
sequence: string
- name: Summary_GT
dtype: string
splits:
- name: train
num_bytes: 521818685
num_examples: 35000
- name: test
num_bytes: 130314128
num_examples: 8450
download_size: 381452241
dataset_size: 652132813
---
# Dataset Card for "v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akoripelly/quadraticdata | ---
license: openrail
---
|
Saxo/total_ko_train_set_1_without_wiki | ---
license: apache-2.0
---
|
mehdidn/ner | ---
license: other
---
|
ORVC/OUltimate | ---
license: cc-by-4.0
---
|
yaqingwang90/LiST_CLUE | ---
license: mit
---
|
katxtong/tokenized_squad_size356 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: train
num_bytes: 170292456
num_examples: 87599
- name: validation
num_bytes: 20548080
num_examples: 10570
download_size: 26850313
dataset_size: 190840536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
DynamicSuperb/EnvironmentalSoundClassification_ESC50-Animals | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 88244955.5
num_examples: 200
download_size: 77522123
dataset_size: 88244955.5
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "environmental_sound_classification_animals_ESC50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zeaver/multifactor_squad1.1_zhou | ---
license: mit
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- question-generation
- HotpotQA
size_categories:
- 10K<n<100K
---
# MultiFactor-SQuAD1.1-Zhou
<!-- Provide a quick summary of the dataset. -->
The MultiFactor dataset (SQuAD1.1-Zhou split [1]) from the EMNLP 2023 Findings paper: [*Improving Question Generation with Multi-level Content Planning*](https://arxiv.org/abs/2310.13512).
## 1. Dataset Details
### 1.1 Dataset Description
The SQuAD1.1-Zhou split [1, 2] from the EMNLP 2023 Findings paper: [*Improving Question Generation with Multi-level Content Planning*](https://arxiv.org/abs/2310.13512).
Based on the dataset in [2], we add the `p_phrase`, `n_phrase`, and `full answer` attributes to every dataset instance.
The full answer is reconstructed with [QA2D](https://github.com/kelvinguu/qanli) [3]. More details are in the paper's GitHub repository: https://github.com/zeaver/MultiFactor.
### 1.2 Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/zeaver/MultiFactor
- **Paper:** [*Improving Question Generation with Multi-level Content Planning*](https://arxiv.org/abs/2310.13512). EMNLP Findings, 2023.
## 2. Dataset Structure
```text
.
├── dev.json
├── test.json
├── train.json
└── fa_model_inference
    ├── dev.json
    ├── test.json
    └── train.json
```
Each split is a single JSON file, not JSONL, so please load it with `json.load(f)` directly. The dataset schema is:
```json
{
"context": "the given input context",
"answer": "the given answer",
"question": "the corresponding question",
"p_phrase": "the postive phrases in the given context",
"n_phrase": "the negative phrases",
"full answer": "pseudo-gold full answer (q + a -> a declarative sentence)",
}
```
We also provide the *FA_Model*'s inference results in `fa_model_inference/{split}.json`.
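Reading a split then comes down to one `json.load` call. The sketch below illustrates the pattern with a tiny made-up instance written to a temporary file standing in for a real split file (the field values, and the assumption that the top-level JSON value is a list of instance dicts and that `p_phrase` holds a list, are illustrative, not taken from the released files):

```python
import json
import os
import tempfile

# An illustrative instance following the schema above (values are made up).
sample = [{
    "context": "Oxygen was first isolated by Carl Wilhelm Scheele.",
    "answer": "Carl Wilhelm Scheele",
    "question": "Who first isolated oxygen?",
    "p_phrase": ["Carl Wilhelm Scheele"],
    "n_phrase": ["Oxygen"],
    "full answer": "Oxygen was first isolated by Carl Wilhelm Scheele.",
}]

# Stand-in for one of the real split files (train.json / dev.json / test.json).
path = os.path.join(tempfile.mkdtemp(), "dev.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(sample, f, ensure_ascii=False)

# Each split is a plain JSON file, so a single json.load suffices
# (no line-by-line JSONL parsing).
with open(path, encoding="utf-8") as f:
    data = json.load(f)

print(data[0]["question"])   # Who first isolated oxygen?
print(data[0]["p_phrase"])   # ['Carl Wilhelm Scheele']
```

The same pattern applies to `fa_model_inference/{split}.json`.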
## 3. Dataset Card Contact
If you have any questions, feel free to contact me: zehua.xia1999@gmail.com
## Reference
[1] Rajpurkar, Pranav, et al. [SQuAD: 100,000+ Questions for Machine Comprehension of Text](https://aclanthology.org/D16-1264/). EMNLP, 2016.
[2] Zhou, Qingyu, et al. [Neural Question Generation from Text: A Preliminary Study](https://arxiv.org/abs/1704.01792). EMNLP, 2017.
[3] Demszky, Dorottya, et al. [Transforming Question Answering Datasets Into Natural Language Inference Datasets](https://arxiv.org/abs/1809.02922). Stanford University. arXiv, 2018. |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-56000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1065055
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
osanseviero/test_osan | ---
task_ids:
- automatic-speech-recognition
dataset_info:
features:
- name: CHANNEL_NAME
dtype: string
- name: URL
dtype: string
- name: TITLE
dtype: string
- name: DESCRIPTION
dtype: string
- name: TRANSCRIPTION
dtype: string
- name: SEGMENTS
dtype: string
splits:
- name: train
num_bytes: 28243998
num_examples: 375
download_size: 12872792
dataset_size: 28243998
tags:
- whisper
- whispering
---
# Dataset Card for "Yannic-Kilcher"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_mavihsrr__GetCode-slerp | ---
pretty_name: Evaluation run of mavihsrr/GetCode-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mavihsrr/GetCode-slerp](https://huggingface.co/mavihsrr/GetCode-slerp) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mavihsrr__GetCode-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T00:21:39.795318](https://huggingface.co/datasets/open-llm-leaderboard/details_mavihsrr__GetCode-slerp/blob/main/results_2024-01-16T00-21-39.795318.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23210164902672914,\n\
\ \"acc_stderr\": 0.029927744191797015,\n \"acc_norm\": 0.23227062322049502,\n\
\ \"acc_norm_stderr\": 0.030723532683480458,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.4977872683391919,\n\
\ \"mc2_stderr\": 0.016733548639246566\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702247,\n\
\ \"acc_norm\": 0.26535836177474403,\n \"acc_norm_stderr\": 0.012902554762313964\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2599083847839076,\n\
\ \"acc_stderr\": 0.00437687761923412,\n \"acc_norm\": 0.26199960167297354,\n\
\ \"acc_norm_stderr\": 0.004388237557526723\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n\
\ \"mc2\": 0.4977872683391919,\n \"mc2_stderr\": 0.016733548639246566\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5177584846093133,\n\
\ \"acc_stderr\": 0.014043619596174964\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/mavihsrr/GetCode-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|arc:challenge|25_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|gsm8k|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hellaswag|10_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-21-39.795318.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T00-21-39.795318.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- '**/details_harness|winogrande|5_2024-01-16T00-21-39.795318.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T00-21-39.795318.parquet'
- config_name: results
data_files:
- split: 2024_01_16T00_21_39.795318
path:
- results_2024-01-16T00-21-39.795318.parquet
- split: latest
path:
- results_2024-01-16T00-21-39.795318.parquet
---
# Dataset Card for Evaluation run of mavihsrr/GetCode-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mavihsrr/GetCode-slerp](https://huggingface.co/mavihsrr/GetCode-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mavihsrr__GetCode-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T00:21:39.795318](https://huggingface.co/datasets/open-llm-leaderboard/details_mavihsrr__GetCode-slerp/blob/main/results_2024-01-16T00-21-39.795318.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23210164902672914,
"acc_stderr": 0.029927744191797015,
"acc_norm": 0.23227062322049502,
"acc_norm_stderr": 0.030723532683480458,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4977872683391919,
"mc2_stderr": 0.016733548639246566
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702247,
"acc_norm": 0.26535836177474403,
"acc_norm_stderr": 0.012902554762313964
},
"harness|hellaswag|10": {
"acc": 0.2599083847839076,
"acc_stderr": 0.00437687761923412,
"acc_norm": 0.26199960167297354,
"acc_norm_stderr": 0.004388237557526723
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4977872683391919,
"mc2_stderr": 0.016733548639246566
},
"harness|winogrande|5": {
"acc": 0.5177584846093133,
"acc_stderr": 0.014043619596174964
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Hahalol/loginui | ---
dataset_info:
features:
- name: filename
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 7373
num_examples: 61
download_size: 4423
dataset_size: 7373
---
# Dataset Card for "loginui"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tverous/demo-amrlib | ---
dataset_info:
features:
- name: uid
dtype: string
- name: article
sequence: string
- name: premise
dtype: string
- name: image
sequence: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: linearized_amr
dtype: string
splits:
- name: train
num_bytes: 47095
num_examples: 6
download_size: 14254
dataset_size: 47095
---
# Dataset Card for "demo-amrlib"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anjan77/ecommerce-faq-lllama2-dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 38858
num_examples: 158
download_size: 9384
dataset_size: 38858
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-classification
tags:
- Question Answering
--- |
urvog/transcripts-llama2-1k | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3658329
num_examples: 1000
download_size: 1410820
dataset_size: 3658329
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/katy-perry | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/katy-perry"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.8409 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:block; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/3f46986b4eb6ceeb06fd9d9166e5a248.900x900x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/katy-perry">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Katy Perry</div>
<a href="https://genius.com/artists/katy-perry">
<div style="text-align: center; font-size: 14px;">@katy-perry</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/katy-perry).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/katy-perry")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   609 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/katy-perry")

# Target proportions for the three splits.
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# np.split cuts the texts at the cumulative boundaries (90% and 97%),
# leaving the final 3% for the test split.
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
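As a quick sanity check of the boundary arithmetic, here is the same split applied to a stand-in list of 609 dummy items (matching the reported train size), so nothing needs to be downloaded:

```python
import numpy as np

# Stand-in for datasets['train']['text']: 609 dummy entries, matching the
# train-split size reported in the table above.
texts = [f"song {i}" for i in range(609)]

train_pct, val_pct = 0.9, 0.07
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_pct), int(len(texts) * (train_pct + val_pct))],
)

print(len(train), len(validation), len(test))  # → 548 42 19
```

The split points are cumulative indices, so the three pieces always cover the whole list with no overlap.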
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
ahdsoft/math_word_problem | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_wnli_null_prepositions | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 10835
num_examples: 60
- name: test
num_bytes: 34504
num_examples: 134
- name: train
num_bytes: 97670
num_examples: 554
download_size: 55123
dataset_size: 143009
---
# Dataset Card for "MULTI_VALUE_wnli_null_prepositions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/synthetic-romantic-characters | ---
dataset_info:
features:
- name: name
dtype: string
- name: categories
sequence: string
- name: personalities
sequence: string
- name: description
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 14989220
num_examples: 5744
download_size: 7896899
dataset_size: 14989220
---
# Dataset Card for "synthetic-romantic-characters"
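Per the schema above, each record's `conversation` is a list of `{content, role}` turns. A minimal sketch (using a made-up record, since the real data is not loaded here) of flattening such a record into a single training string:

```python
# Hypothetical record following the feature schema in this card.
record = {
    "name": "Alex",
    "conversation": [
        {"role": "user", "content": "Hi there!"},
        {"role": "assistant", "content": "Hello, lovely to meet you."},
    ],
}

def to_text(rec):
    """Join the conversation turns into one prompt-style string."""
    return "\n".join(
        f"{turn['role']}: {turn['content']}" for turn in rec["conversation"]
    )

print(to_text(record))
```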
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Prajwal3009/unisys1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 294074
num_examples: 1267
download_size: 95581
dataset_size: 294074
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chohi/molit_train_data | ---
dataset_info:
features:
- name: inpurt
dtype: float64
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
splits:
- name: train
num_bytes: 3665
num_examples: 10
download_size: 6237
dataset_size: 3665
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16 | ---
pretty_name: Evaluation run of CoolWP/llama-2-13b-guanaco-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CoolWP/llama-2-13b-guanaco-fp16](https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:49:30.894423](https://huggingface.co/datasets/open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16/blob/main/results_2023-08-17T18%3A49%3A30.894423.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5557402565625233,\n\
\ \"acc_stderr\": 0.03433097920024075,\n \"acc_norm\": 0.5600027152011281,\n\
\ \"acc_norm_stderr\": 0.03430992590405376,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43400538092704843,\n\
\ \"mc2_stderr\": 0.014284105671223521\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.615116510655248,\n\
\ \"acc_stderr\": 0.004855733568540267,\n \"acc_norm\": 0.8239394542919737,\n\
\ \"acc_norm_stderr\": 0.003800932770597752\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871137,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871137\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890474,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890474\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n\
\ \"acc_stderr\": 0.015302380123542108,\n \"acc_norm\": 0.7586206896551724,\n\
\ \"acc_norm_stderr\": 0.015302380123542108\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765408,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765408\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\
\ \"acc_stderr\": 0.01600698993480319,\n \"acc_norm\": 0.3553072625698324,\n\
\ \"acc_norm_stderr\": 0.01600698993480319\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347813,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722334,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336461,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336461\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \
\ \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43400538092704843,\n\
\ \"mc2_stderr\": 0.014284105671223521\n }\n}\n```"
repo_url: https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:49:30.894423.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- results_2023-08-17T18:49:30.894423.parquet
- split: latest
path:
- results_2023-08-17T18:49:30.894423.parquet
---
# Dataset Card for Evaluation run of CoolWP/llama-2-13b-guanaco-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CoolWP/llama-2-13b-guanaco-fp16](https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-17T18:49:30.894423](https://huggingface.co/datasets/open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16/blob/main/results_2023-08-17T18%3A49%3A30.894423.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5557402565625233,
"acc_stderr": 0.03433097920024075,
"acc_norm": 0.5600027152011281,
"acc_norm_stderr": 0.03430992590405376,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.43400538092704843,
"mc2_stderr": 0.014284105671223521
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526843,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.615116510655248,
"acc_stderr": 0.004855733568540267,
"acc_norm": 0.8239394542919737,
"acc_norm_stderr": 0.003800932770597752
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871137,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871137
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890474,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890474
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.015302380123542108,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.015302380123542108
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765408,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3553072625698324,
"acc_stderr": 0.01600698993480319,
"acc_norm": 0.3553072625698324,
"acc_norm_stderr": 0.01600698993480319
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347813,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722334,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336461,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336461
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5408496732026143,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.5408496732026143,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.43400538092704843,
"mc2_stderr": 0.014284105671223521
}
}
```
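The results blob above is plain JSON keyed by task name, so individual metrics can be pulled out with the standard library alone. A minimal sketch, using a hypothetical excerpt of the file rather than the full results (the values below are copied from the TruthfulQA entry above):

```python
import json

# Hypothetical excerpt of the results JSON shown above; the real file
# contains one such object per evaluated task.
results_json = """
{
  "harness|truthfulqa:mc|0": {
    "mc1": 0.29865361077111385,
    "mc1_stderr": 0.016021570613768542,
    "mc2": 0.43400538092704843,
    "mc2_stderr": 0.014284105671223521
  }
}
"""

results = json.loads(results_json)
mc2 = results["harness|truthfulqa:mc|0"]["mc2"]
print(f"TruthfulQA mc2: {mc2:.4f}")  # prints "TruthfulQA mc2: 0.4340"
```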
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DarwinAnim8or/DMV-Plate-Review | ---
license: mit
---
# DMV-Plates
This dataset contains various plates and their DMV responses.
Props to avery for making this JSONL file! |
christinacdl/hate_speech_2_classes | ---
license: apache-2.0
---
|
justinwilloughby/mimarchive-bge-large-en-v1.5 | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_mrpc_fixin_future | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 40487
num_examples: 143
- name: train
num_bytes: 80602
num_examples: 284
- name: validation
num_bytes: 12091
num_examples: 41
download_size: 96637
dataset_size: 133180
---
# Dataset Card for "MULTI_VALUE_mrpc_fixin_future"
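The YAML header above declares the schema (two string sentences, plus `label`, `idx`, and `value_score` integers). A record therefore looks like the following sketch; only the field names and types come from the header, the sentence values are hypothetical:

```python
# Hypothetical example record; field names and types follow the
# dataset_info features declared in the card header.
record = {
    "sentence1": "He said the food be gettin' cold soon.",
    "sentence2": "He said the food was going to get cold soon.",
    "label": 1,
    "idx": 0,
    "value_score": 1,
}

# Basic shape check against the schema from the card header.
expected_types = {
    "sentence1": str,
    "sentence2": str,
    "label": int,
    "idx": int,
    "value_score": int,
}
assert set(record) == set(expected_types)
assert all(isinstance(record[k], t) for k, t in expected_types.items())
```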
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jtatman/textbooks-lite-100k-sharegpt | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 478190673
num_examples: 113641
download_size: 211268601
dataset_size: 478190673
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
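The `conversations` feature declared in the YAML above follows the common ShareGPT layout: each row holds a list of `{"from": ..., "value": ...}` turns. A minimal sketch of one record and a shape check (the turn contents are hypothetical):

```python
# Hypothetical record in the ShareGPT layout declared in the YAML header.
example = {
    "conversations": [
        {"from": "human", "value": "Summarize the chapter on photosynthesis."},
        {"from": "gpt", "value": "Photosynthesis converts light energy into chemical energy."},
    ]
}

def is_sharegpt_record(row):
    """Return True if a row matches the {from, value} turn structure."""
    turns = row.get("conversations")
    if not isinstance(turns, list) or not turns:
        return False
    return all(
        isinstance(t, dict)
        and set(t) == {"from", "value"}
        and isinstance(t["from"], str)
        and isinstance(t["value"], str)
        for t in turns
    )

print(is_sharegpt_record(example))  # prints "True"
```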
|